Problem Statement

Pneumonia is an infection in one or both lungs. Bacteria, viruses, and fungi cause it. The infection causes inflammation in the air sacs in your lungs, which are called alveoli. Pneumonia accounts for over 15% of all deaths of children under 5 years old internationally. In 2017, 920,000 children under the age of 5 died from the disease. It requires review of a chest radiograph (CXR) by highly trained specialists and confirmation through clinical history, vital signs and laboratory exams. Pneumonia usually manifests as an area or areas of increased opacity on CXR. However, the diagnosis of pneumonia on CXR is complicated because of a number of other conditions in the lungs such as fluid overload (pulmonary edema), bleeding, volume loss (atelectasis or collapse), lung cancer, or post- radiation or surgical changes. Outside of the lungs, fluid in the pleural space (pleural effusion) also appears as increased opacity on CXR. When available, comparison of CXRs of the patient taken at different time points and correlation with clinical symptoms and history are helpful in making the diagnosis. CXRs are the most commonly performed diagnostic imaging study. A number of factors such as positioning of the patient and depth of inspiration can alter the appearance of the CXR, complicating interpretation further. In addition, clinicians are faced with reading high volumes of images every shift. Pneumonia Detection Now to detection Pneumonia we need to detect Inflammation of the lungs. In this project, you’re challenged to build an algorithm to detect a visual signal for pneumonia in medical images. Specifically, your algorithm needs to automatically locate lung opacities on chest radiographs.

Objectives

The objective of the project is,

  • Learn to how to do build an Object Detection Model
  • Use transfer learning to fine-tune a model.
  • Learn to
  • Read different research papers of given domain to obtain the knowledge of advanced models for the given problem.

Import the Libraries

In [ ]:
pip install segmentation_models
Collecting segmentation_models
  Downloading https://files.pythonhosted.org/packages/da/b9/4a183518c21689a56b834eaaa45cad242d9ec09a4360b5b10139f23c63f4/segmentation_models-1.0.1-py3-none-any.whl
Collecting image-classifiers==1.0.0
  Downloading https://files.pythonhosted.org/packages/81/98/6f84720e299a4942ab80df5f76ab97b7828b24d1de5e9b2cbbe6073228b7/image_classifiers-1.0.0-py3-none-any.whl
Collecting efficientnet==1.0.0
  Downloading https://files.pythonhosted.org/packages/97/82/f3ae07316f0461417dc54affab6e86ab188a5a22f33176d35271628b96e0/efficientnet-1.0.0-py3-none-any.whl
Collecting keras-applications<=1.0.8,>=1.0.7
  Downloading https://files.pythonhosted.org/packages/71/e3/19762fdfc62877ae9102edf6342d71b28fbfd9dea3d2f96a882ce099b03f/Keras_Applications-1.0.8-py3-none-any.whl (50kB)
     |████████████████████████████████| 51kB 6.6MB/s 
Requirement already satisfied: scikit-image in /usr/local/lib/python3.6/dist-packages (from efficientnet==1.0.0->segmentation_models) (0.16.2)
Requirement already satisfied: numpy>=1.9.1 in /usr/local/lib/python3.6/dist-packages (from keras-applications<=1.0.8,>=1.0.7->segmentation_models) (1.19.4)
Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from keras-applications<=1.0.8,>=1.0.7->segmentation_models) (2.10.0)
Requirement already satisfied: networkx>=2.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (2.5)
Requirement already satisfied: PyWavelets>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (1.1.1)
Requirement already satisfied: scipy>=0.19.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (1.4.1)
Requirement already satisfied: pillow>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (7.0.0)
Requirement already satisfied: matplotlib!=3.0.0,>=2.0.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (3.2.2)
Requirement already satisfied: imageio>=2.3.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (2.4.1)
Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from h5py->keras-applications<=1.0.8,>=1.0.7->segmentation_models) (1.15.0)
Requirement already satisfied: decorator>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from networkx>=2.0->scikit-image->efficientnet==1.0.0->segmentation_models) (4.4.2)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (2.8.1)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (1.3.1)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (2.4.7)
Installing collected packages: keras-applications, image-classifiers, efficientnet, segmentation-models
Successfully installed efficientnet-1.0.0 image-classifiers-1.0.0 keras-applications-1.0.8 segmentation-models-1.0.1
In [ ]:
pip install pydot
Requirement already satisfied: pydot in /usr/local/lib/python3.6/dist-packages (1.3.0)
Requirement already satisfied: pyparsing>=2.1.4 in /usr/local/lib/python3.6/dist-packages (from pydot) (2.4.7)
In [1]:
pip install pydicom
Collecting pydicom
  Downloading https://files.pythonhosted.org/packages/f4/15/df16546bc59bfca390cf072d473fb2c8acd4231636f64356593a63137e55/pydicom-2.1.2-py3-none-any.whl (1.9MB)
     |████████████████████████████████| 1.9MB 12.3MB/s 
Installing collected packages: pydicom
Successfully installed pydicom-2.1.2
In [ ]:
pip install tensorflow
In [2]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import pylab as pl
import tensorflow as tf
import os
import pydicom
import cv2
import shutil
from glob import glob

from matplotlib.patches import Rectangle
import math
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.mobilenet import preprocess_input

#Mobile Net
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import Reshape, UpSampling2D, Input, Concatenate, Conv2D,Dense,Activation, BatchNormalization, SpatialDropout2D
from tensorflow.keras.models import Model,Sequential
#import segmentation_models
#from segmentation_models.losses import bce_jaccard_loss
#from segmentation_models.metrics import iou_score

from tensorflow.keras.models import Model # to join the two models
from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau
from sklearn.model_selection import train_test_split
from tensorflow.keras.optimizers import Adam

# to define loss
from tensorflow.keras.losses import binary_crossentropy
from tensorflow.keras.backend import log, epsilon

from tensorflow.keras.applications import VGG16
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Input, Dense, Flatten, Dropout, BatchNormalization
from tensorflow.keras.layers import Conv2D, SeparableConv2D, MaxPool2D, LeakyReLU, Activation
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import ModelCheckpoint, ReduceLROnPlateau, EarlyStopping
from tensorflow.keras.optimizers import Adam,SGD, RMSprop
from tensorflow.keras.losses import binary_crossentropy
from tqdm import tqdm_notebook
from tqdm import tqdm
import seaborn as sns
import math
import random
from keras.utils.vis_utils import plot_model
from sklearn.utils import resample


from sklearn.metrics import roc_auc_score, roc_curve, classification_report, confusion_matrix
from tensorflow.keras.callbacks import CSVLogger, ModelCheckpoint, ReduceLROnPlateau
from tensorflow.keras.layers import Dense, Dropout, Input, GlobalAveragePooling2D
from tensorflow.keras.applications.densenet import preprocess_input
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.applications import DenseNet169
from tensorflow.keras.applications import VGG19
#from skimage.transform import resize
import keras
import math
random_state = 2020


# Ignore the warnings
import warnings
warnings.filterwarnings("ignore")
In [ ]:
print(tf.__version__)
2.4.0

Image Properties Below

Change to the current project directory

In [3]:
from google.colab import drive
drive.mount("/content/gdrive/")
Mounted at /content/gdrive/
In [4]:
os.chdir('/content/gdrive/MyDrive/Colab_Notebooks/Capstone_Project')
In [ ]:
#os.chdir('/Volumes/Ayon_Drive/GreatLearning/Capstone_Pneumonia/')

Name of training and test images and bounding box details below

In [5]:
DET_CLASS_INFO = 'stage_2_detailed_class_info.csv'
TRAIN_BBOX = 'stage_2_train_labels.csv'
TRAIN_IMG_DCM = "stage_2_train_images"
TEST_IMG_DCM = "stage_2_test_images"
TRAIN_IMG_DIR_JPG = 'JPG_train'
TEST_IMG_DIR_JPG = 'JPG_test'
nbrImages = 1# Swathi-Changed this to 1
TRAIN_IMG_AUG_JPG = 'JPG_train_aug'

Pre-Processing, Data Visualization, EDA

Exploratory Data Analysis (EDA) Here as a part of EDA, we will:

Start with understanding of the data with a brief on train/test labels and respective class info Look at the first five rows of both the csvs (train and test) Identify how are classes and target distributed Check the number of patients with 1, 2, ... bounding boxes Read and extract metadata from dicom files Perform analysis on some of the features from dicom files Check some random images from the training dataset Draw insights from the data at various stages of EDA

Reading CSVs Images for the current stage in the stage_2_train_images and stage_2_test_images. Training data: stage_2_train_labels.csv stage_2_detailed_class_info.csv containing detailed information about the positive and negative classes in the training set

Loading detailed class info file

In [6]:
class_df = pd.read_csv(DET_CLASS_INFO)
In [7]:
print("\nClass dataframe has 30227 rows and 2 columns:")
class_df.shape
Class dataframe has 30227 rows and 2 columns:
Out[7]:
(30227, 2)
In [8]:
print("\nClass dataframe first 5 rows:")
class_df.head()
Class dataframe first 5 rows:
Out[8]:
patientId class
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 No Lung Opacity / Not Normal
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd No Lung Opacity / Not Normal
2 00322d4d-1c29-4943-afc9-b6754be640eb No Lung Opacity / Not Normal
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 Normal
4 00436515-870c-4b36-a041-de91049b9ab4 Lung Opacity
In [9]:
print('Total No of Patients in Class Info', class_df['patientId'].value_counts().shape[0])
Total No of Patients in Class Info 26684
In [10]:
print('Total distinct classes: ', class_df['class'].unique())
Total distinct classes:  ['No Lung Opacity / Not Normal' 'Normal' 'Lung Opacity']

We see there are 3 classes, Normal, Lung Opacity and No Lung Opacity/Not Normal

No Lung Opacity/Not Normal are cases that look like opacity but are not.

Check for duplicates in patient id

In [11]:
##Identify duplicates records in the data
dupes = class_df['patientId'].duplicated()
sum(dupes)
Out[11]:
3543

3543 patients have duplicates in class info

In [12]:
class_df.groupby('class').size().plot.bar(5, 10, color=['Orange', 'green', 'Indigo'])
Out[12]:
<matplotlib.axes._subplots.AxesSubplot at 0x7eff9b93d1d0>

Load CSV file containing training set patientIds and labels (Bounding Boxes)

In [13]:
labels_df = pd.read_csv(TRAIN_BBOX)
In [14]:
print("\nLabel dataframe first 5 rows:")
labels_df.head()
Label dataframe first 5 rows:
Out[14]:
patientId x y width height Target
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 NaN NaN NaN NaN 0
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd NaN NaN NaN NaN 0
2 00322d4d-1c29-4943-afc9-b6754be640eb NaN NaN NaN NaN 0
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 NaN NaN NaN NaN 0
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1

We see patient ids, and bounding box is present in the dataset. 0 means No Pneumonia, 1 means Pneumonia

Bounding Box is not present when the patient doesnot have pneumonia, however 0 can mean No Lung Opacity/Not Normal¶

In [15]:
print(f'Train Labels dataframe has {labels_df.shape[0]} rows and {labels_df.shape[1]} columns')
Train Labels dataframe has 30227 rows and 6 columns

Converting Not a number to 0

In [16]:
labels_df['x'] = labels_df['x'].replace(np.nan, 0)
labels_df['y'] = labels_df['y'].replace(np.nan, 0)
labels_df['width'] = labels_df['width'].replace(np.nan, 0)
labels_df['height'] = labels_df['height'].replace(np.nan, 0)
In [17]:
print("\nUpdated data samples:")
labels_df.head()
Updated data samples:
Out[17]:
patientId x y width height Target
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1

There are 30,227 patient ids

Checking the dataset information

In [18]:
labels_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 30227 entries, 0 to 30226
Data columns (total 6 columns):
 #   Column     Non-Null Count  Dtype  
---  ------     --------------  -----  
 0   patientId  30227 non-null  object 
 1   x          30227 non-null  float64
 2   y          30227 non-null  float64
 3   width      30227 non-null  float64
 4   height     30227 non-null  float64
 5   Target     30227 non-null  int64  
dtypes: float64(4), int64(1), object(1)
memory usage: 1.4+ MB
In [19]:
labels_df.describe()
Out[19]:
x y width height Target
count 30227.000000 30227.000000 30227.000000 30227.000000 30227.000000
mean 124.561683 115.960962 69.060575 104.084825 0.316108
std 216.326397 190.012883 106.910496 176.932152 0.464963
min 0.000000 0.000000 0.000000 0.000000 0.000000
25% 0.000000 0.000000 0.000000 0.000000 0.000000
50% 0.000000 0.000000 0.000000 0.000000 0.000000
75% 193.000000 231.000000 169.000000 188.000000 1.000000
max 835.000000 881.000000 528.000000 942.000000 1.000000

Checking duplicates if it matches with class info file

In [20]:
##Identify duplicates records in the data
dupes = labels_df['patientId'].duplicated()
sum(dupes)
Out[20]:
3543

3543 patients have multiple X rays

Check for missing values

In [21]:
print(" \nCount total NaN at each column in the dataset : \n\n", 
      labels_df.isnull().sum())
 
Count total NaN at each column in the dataset : 

 patientId    0
x            0
y            0
width        0
height       0
Target       0
dtype: int64

From the above we see there are no null values

In [22]:
print('Lets check the distribution of `Target` and `class` column'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (labels_df['Target'].value_counts()
    .plot(kind = 'pie', autopct = '%.0f%%', 
          labels = ['Negative', 'Pneumonia Evidence'], 
          colors = ['green', 'red'], 
          startangle = 90, 
          title = 'Distribution of Target', fontsize = 12)
    .set_ylabel(''))
ax = fig.add_subplot(122)
g = (class_df['class'].value_counts().sort_index(ascending = False)
    .plot(kind = 'pie', autopct = '%.0f%%', 
          colors = ['green', 'orange', 'red'], 
          startangle = 90, title = 'Distribution of Class', 
          fontsize = 12)
    .set_ylabel(''))
plt.tight_layout()
Lets check the distribution of `Target` and `class` column
--------------------------------------------------------------------------------

We will check the number of boxes for each patient

In [23]:
box_patient_df = labels_df.groupby('patientId').size().reset_index(name='boxes')
box_patient_df.groupby('boxes').size().reset_index(name='patients')
Out[23]:
boxes patients
0 1 23286
1 2 3266
2 3 119
3 4 13
In [24]:
labels_class_df = pd.merge(labels_df, class_df, how='inner', on='patientId')
print('Total Cases : ', labels_class_df.shape[0])
Total Cases :  37629

Since there are duplicate patient ids in both the datasets we see an increase in the number of rows

Instead of doing inner join lets try to concat both the datasets

In [25]:
print('Let\'s also check whether each patientId has only one type of class'); print('--'*40)
print('Yes, each patientId is associated with only {} class'.format(class_df.groupby(['patientId'])['class'].nunique().max()))

# Merge the two dataframes
train_class_df = pd.concat([labels_df, class_df['class']], axis = 1)
print('Shape of the dataset after the merge: {}'.format(train_class_df.shape))
Let's also check whether each patientId has only one type of class
--------------------------------------------------------------------------------
Yes, each patientId is associated with only 1 class
Shape of the dataset after the merge: (30227, 7)
In [26]:
train_class_df.head(5)
Out[26]:
patientId x y width height Target class
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity

Observations from the CSVs Based on analysis above, some of the observations:

Training data is having a set of patientIds and bounding boxes. Bounding boxes are defined as follows: x, y, width and height. There are multiple records for patients. Number of duplicates in patientID = 3,543. There is also a binary target column i.e. Target indicating there was evidence of pneumonia or no definitive evidence of pneumonia. Class label contains: No Lung Opacity/Not Normal, Normal and Lung Opacity. Chest examinations with Target = 1 i.e. ones with evidence of Pneumonia are associated with Lung Opacity class. Chest examinations with Target = 0 i.e. those with no definitive effidence of Pneumonia are either of Normal or No Lung Opacity / Not Normal class. About 23,286 patientIds (~87% of them) provided have 1 bounding boxes while 13 patients have 4 bounding boxes!!!!

Reading Images Images provided are stored in DICOM (.dcm) format which is an international standard to transmit, store, retrieve, print, process, and display medical imaging information. Digital Imaging and Communications in Medicine (DICOM) makes medical imaging information interoperable. We will make use of pydicom package here to read the images.

In [27]:
def checkXray(i, dirName):
    patientId = train_class_df['patientId'][i]
    print("Patient Id: ", patientId)
    fileName = dirName + "/" + patientId
    print("\nBounding Box Coordinates, X: ", train_class_df['x'][i])
    print("\nBounding Box Coordinates, Y: ", train_class_df['y'][i])
    print("\nBounding Box Coordinates, Width: ", train_class_df['width'][i])
    print("\nBounding Box Coordinates, Height: ", train_class_df['height'][i])
    
    patient_file = '%s.dcm' % fileName
    patient_data = pydicom.read_file(patient_file)
    print(patient_data)
    
    plt.imshow(patient_data.pixel_array,cmap=pl.cm.gist_gray)

Let's take a look into an image of a person who has normal lungs

In [30]:
checkXray(3, TRAIN_IMG_DCM)
Patient Id:  003d8fa0-6bf1-40ed-b54c-ac657f8495c5

Bounding Box Coordinates, X:  0.0

Bounding Box Coordinates, Y:  0.0

Bounding Box Coordinates, Width:  0.0

Bounding Box Coordinates, Height:  0.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 200
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.2293.1517874295.733882
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.2293.1517874295.733882
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '003d8fa0-6bf1-40ed-b54c-ac657f8495c5'
(0010, 0020) Patient ID                          LO: '003d8fa0-6bf1-40ed-b54c-ac657f8495c5'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'M'
(0010, 1010) Patient's Age                       AS: '28'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.2293.1517874295.733881
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.2293.1517874295.733880
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 155284 elements

Let's take a look into an image of a person who has lung opacity

In [31]:
checkXray(4, TRAIN_IMG_DCM)
Patient Id:  00436515-870c-4b36-a041-de91049b9ab4

Bounding Box Coordinates, X:  264.0

Bounding Box Coordinates, Y:  152.0

Bounding Box Coordinates, Width:  213.0

Bounding Box Coordinates, Height:  379.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 200
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.6379.1517874325.469569
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.6379.1517874325.469569
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: AP'
(0010, 0010) Patient's Name                      PN: '00436515-870c-4b36-a041-de91049b9ab4'
(0010, 0020) Patient ID                          LO: '00436515-870c-4b36-a041-de91049b9ab4'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '32'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'AP'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.6379.1517874325.469568
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.6379.1517874325.469567
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.139, 0.139]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 119382 elements

Let's take a look into an image of a person who has No Lung Opacity/Not Normal¶

In [32]:
checkXray(0, TRAIN_IMG_DCM)
Patient Id:  0004cfab-14fd-4e49-80ba-63a80b6bddd6

Bounding Box Coordinates, X:  0.0

Bounding Box Coordinates, Y:  0.0

Bounding Box Coordinates, Width:  0.0

Bounding Box Coordinates, Height:  0.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 202
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0020) Patient ID                          LO: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '51'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.28530.1517874485.775525
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.28530.1517874485.775524
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 142006 elements
In [33]:
# Helper function to get additional features from dicom images
def get_tags(data, path):
    images = os.listdir(path)
    for _, name in tqdm_notebook(enumerate(images)):
        img_path = os.path.join(path, name)
        img_data = pydicom.read_file(img_path)
        idx = (data['patientId'] == img_data.PatientID)
        data.loc[idx,'PatientSex'] = img_data.PatientSex
        data.loc[idx,'PatientAge'] = pd.to_numeric(img_data.PatientAge)
        data.loc[idx,'BodyPartExamined'] = img_data.BodyPartExamined
        data.loc[idx,'ViewPosition'] = img_data.ViewPosition
        data.loc[idx,'Modality'] = img_data.Modality
In [34]:
print('Read the training images file names and path'); print('--'*40)
images = pd.DataFrame({'path': glob(os.path.join(TRAIN_IMG_DCM, '*.dcm'))})
images['patientId'] = images['path'].map(lambda x: os.path.splitext(os.path.basename(x))[0])
print('Number of images in the training folder: {}'.format(images.shape[0]))
print('Columns in the training images dataframe: {}'.format(list(images.columns)))
assert images.shape[0] == len(list(set(train_class_df['patientId']))), 'Number of training images should be equal to the unique patientIds we have'
Read the training images file names and path
--------------------------------------------------------------------------------
Number of images in the training folder: 26684
Columns in the training images dataframe: ['path', 'patientId']
In [35]:
print('Merge path from the `images` dataframe with `train_class` dataframe'); print('--'*40)
train_class_df = train_class_df.merge(images, on = 'patientId', how = 'left')
print('Shape of the `train_class` dataframe after merge: {}'.format(train_class_df.shape))
Merge path from the `images` dataframe with `train_class` dataframe
--------------------------------------------------------------------------------
Shape of the `train_class` dataframe after merge: (30227, 8)
In [36]:
train_class_df.head()
Out[36]:
patientId x y width height Target class path
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6...
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c...
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b...
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a...
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d...
In [37]:
print('Get features such as {} from training images'.format(('PatientSex', 'PatientAge', 'BodyPartExamined', 'ViewPosition', 'Modality')))
if os.path.isfile('train_feature_engineered.pkl') :
    print('File exists..So we have the data')
else:
    get_tags(train_class_df, TRAIN_IMG_DCM)
    train_class_df.to_pickle('train_feature_engineered.pkl')

print('Saving the feature engineered dataframe for future use'); print('--'*40)
Get features such as ('PatientSex', 'PatientAge', 'BodyPartExamined', 'ViewPosition', 'Modality') from training images
File exists..So we have the data
Saving the feature engineered dataframe for future use
--------------------------------------------------------------------------------
In [38]:
train_class_df = pd.read_pickle('train_feature_engineered.pkl')
train_class_df.shape
Out[38]:
(30227, 13)
In [40]:
train_class_df.head()
Out[40]:
patientId x y width height Target class path PatientSex PatientAge BodyPartExamined ViewPosition Modality
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 CHEST PA CR
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 CHEST PA CR
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 CHEST AP CR
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 CHEST PA CR
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 CHEST AP CR
In [41]:
print('As expected unique in `BodyPartExamined` is: {}'.format(train_class_df['BodyPartExamined'].unique()[0]))
print('Unique in `Modality` is: {}'.format(train_class_df['Modality'].unique()[0])); print('--'*40)
As expected unique in `BodyPartExamined` is: CHEST
Unique in `Modality` is: CR
--------------------------------------------------------------------------------
In [42]:
print('Overall the distribution is almost equal for `ViewPosition` but where there\'s a Pneumonia Evidence, `ViewPosition` is `AP`')
print('AP: Anterior/Posterior, PA: Posterior/Anterior'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (train_class_df['ViewPosition'].value_counts()
    .plot(kind = 'pie', autopct = '%.0f%%',  
          startangle = 90,
          title = 'Distribution of ViewPosition, Overall', 
          fontsize = 12)
    .set_ylabel(''))
ax = fig.add_subplot(122)
g = (train_class_df.loc[train_class_df['Target'] == 1, 'ViewPosition']
     .value_counts().sort_index(ascending = False)
    .plot(kind = 'pie', autopct = '%.0f%%', 
          startangle = 90, counterclock = False, 
          title = 'Distribution of ViewPosition, Pneumonia Evidence', 
          fontsize = 12)
    .set_ylabel(''))
Overall the distribution is almost equal for `ViewPosition` but where there's a Pneumonia Evidence, `ViewPosition` is `AP`
AP: Anterior/Posterior, PA: Posterior/Anterior
--------------------------------------------------------------------------------
In [43]:
print('Plot x and y centers of bounding boxes'); print('--'*40)
# Creating a dataframe with columns for center of the rectangles
bboxes = train_class_df[train_class_df['Target'] == 1]
bboxes['xw'] = bboxes['x'] + bboxes['width'] / 2
bboxes['yh'] = bboxes['y'] + bboxes['height'] / 2

g = sns.jointplot(x = bboxes['xw'], y = bboxes['yh'], data = bboxes, 
                  kind = 'hex', alpha = 0.5, size = 8)
plt.suptitle('Bounding Boxes Location, Pneumonia Evidence')
plt.tight_layout()
plt.subplots_adjust(top = 0.95)
plt.show()
Plot x and y centers of bounding boxes
--------------------------------------------------------------------------------
In [44]:
# Helper function to plot bboxes scatter
# Reference for this function & plots: https://www.kaggle.com/gpreda/rsna-pneumonia-detection-eda
def bboxes_scatter(df1, df2, text1, text2):
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize = (13, 8))
    fig.subplots_adjust(top = 0.85)
    fig.suptitle('Plotting centers of lung opacity\n{} & {}'.format(text1, text2))
    df1.plot.scatter(x = 'xw', y = 'yh', ax = ax1, alpha = 0.8, marker = '.', 
                   xlim = (0, 1024), ylim = (0, 1024), color = 'green')
    ax1.set_title('Centers of Lung Opacity\n{}'.format(text1))
    for i, row in df1.iterrows():
        ax1.add_patch(Rectangle(xy = (row['x'], row['y']),
                            width = row['width'], height = row['height'], 
                            alpha = 3.5e-3, color = 'yellow'))
    plt.title('Centers of Lung Opacity\n{}'.format(text2))
    df2.plot.scatter(x = 'xw', y = 'yh', ax = ax2, alpha = 0.8, marker = '.',
                   color = 'brown',  xlim = (0, 1024), ylim = (0, 1024))
    ax2.set_title('Centers of Lung Opacity\n{}'.format(text2))
    for i, row in df2.iterrows():
        ax2.add_patch(Rectangle(xy = (row['x'], row['y']),
                             width = row['width'], height = row['height'],
                             alpha = 3.5e-3, 
                             color = 'yellow'))
    plt.show()
In [45]:
print('Exploring the bounding boxes centers for `ViewPositions` for random sample = 1000')

df1 = bboxes[bboxes['ViewPosition'] == 'PA'].sample(1000)
df2 = bboxes[bboxes['ViewPosition'] == 'AP'].sample(1000)
bboxes_scatter(df1, df2, 'View Position = PA', 'View Position = AP')
Exploring the bounding boxes centers for `ViewPositions` for random sample = 1000

Observations: BodyPartExamined & ViewPosition Above we saw,

BodyPartExamined is unique for all cases and is CHEST in the training dataset and that was also expected. Unique in Modality is CR i.e. Computer Radiography Overall ViewPosition is almost equally distributed in the training dataset but for cases where Target=1, most of the view position are AP.

In [46]:
print('Checking outliers in `PatientAge'); print('--'*40)
print('Minimum `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].min()))
print('Maximum `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].max()))
print('75th Percentile of `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].quantile(0.75)))
print('`PatientAge` in upper whisker for box plot: {}'.format(train_class_df['PatientAge'].quantile(0.75) + (train_class_df['PatientAge'].quantile(0.75) - train_class_df['PatientAge'].quantile(0.25))))
print()
fig = plt.figure(figsize = (10, 6))
ax = sns.boxplot(data = train_class_df['PatientAge'], orient = 'h').set_title('Outliers in PatientAge')
Checking outliers in `PatientAge
--------------------------------------------------------------------------------
Minimum `PatientAge` in the training dataset: 1.0
Maximum `PatientAge` in the training dataset: 155.0
75th Percentile of `PatientAge` in the training dataset: 59.0
`PatientAge` in upper whisker for box plot: 84.0

In [47]:
print('Using pd.clip to set upper threshold of 100 for age and remove outliers'); print('--'*40)
train_class_df['PatientAge'] = train_class_df['PatientAge'].clip(train_class_df['PatientAge'].min(), 100)
train_class_df['PatientAge'].describe().astype(int)
Using pd.clip to set upper threshold of 100 for age and remove outliers
--------------------------------------------------------------------------------
Out[47]:
count    30227
mean        46
std         16
min          1
25%         34
50%         49
75%         59
max        100
Name: PatientAge, dtype: int64
In [48]:
print('Get the distribution of `PatientAge` overall and where Target = 1'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (sns.distplot(train_class_df['PatientAge'])
    .set_title('Distribution of PatientAge, Overall'))
ax = fig.add_subplot(122)
g = (sns.distplot(train_class_df.loc[train_class_df['Target'] == 1, 'PatientAge'])
    .set_title('Distribution of PatientAge, Pneumonia Evidence'))
Get the distribution of `PatientAge` overall and where Target = 1
--------------------------------------------------------------------------------

Using Binning Method for PatientAge feature

We'll make use of a pd.cut which is 'Bin values into discrete intervals'. Use of this method is recommended when need is to segment and sort data values into bins. This function is also useful for going from a continuous variable to a categorical variable. Supports binning into an equal number of bins, or a pre-specified array of bins.

In [49]:
print('Creating Age Binning field', '--'*40)
train_class_df['AgeBins'] = pd.cut(train_class_df['PatientAge'], bins = 4, precision = 0, labels = ['<=26', '<=50', '<=75', '<=100'])
train_class_df['AgeBins'].value_counts()
Creating Age Binning field --------------------------------------------------------------------------------
Out[49]:
<=75     13318
<=50     12157
<=26      3972
<=100      780
Name: AgeBins, dtype: int64
In [50]:
print('Value counts of the age bin field created'); print('--'*40)
display(pd.concat([train_class_df['AgeBins'].value_counts().sort_index().rename('Counts of Age Bins, Overall'), 
                   train_class_df.loc[train_class_df['Target'] == 1, 'AgeBins'].value_counts().sort_index().rename('Counts of Age Bins, Target=1')], axis = 1))
print()
f, (ax1, ax2) = plt.subplots(1, 2, figsize = (10, 6))
g = sns.countplot(x = train_class_df['AgeBins'], ax = ax1).set_title('Count Plot of Age Bins, Overall')
g = sns.countplot(x = train_class_df.loc[train_class_df['Target'] == 1, 'AgeBins'], ax = ax2).set_title('Count Plot of Age Bins, Pneumonia Evidence')
plt.tight_layout()
Value counts of the age bin field created
--------------------------------------------------------------------------------
Counts of Age Bins, Overall Counts of Age Bins, Target=1
<=26 3972 1478
<=50 12157 3917
<=75 13318 3895
<=100 780 265

In [51]:
print('Exploring the bounding boxes centers for `AgeBins` for random sample = 200')
# Creating a dataframe with columns for center of the rectangles
bboxes = train_class_df[train_class_df['Target'] == 1]
bboxes['xw'] = bboxes['x'] + bboxes['width'] / 2
bboxes['yh'] = bboxes['y'] + bboxes['height'] / 2

df1 = bboxes[bboxes['AgeBins'] == '<=26'].sample(200)
df2 = bboxes[bboxes['AgeBins'] == '<=100'].sample(200)
bboxes_scatter(df1, df2, '1 < AgeBins < 26 (Lower Bin)', '76 < AgeBins < 100 (Upper Bin)')
Exploring the bounding boxes centers for `AgeBins` for random sample = 200
In [52]:
print('Checking distribution of age for those with Pneumonia Evidence, by Gender & Count Plot of Gender'); print('--'*40)
display(pd.concat([train_class_df['PatientSex'].value_counts(normalize = True).round(2).sort_values().rename('% Gender, Overall'), 
                   train_class_df.loc[(train_class_df['Target'] == 1), 'PatientSex']
                   .value_counts(normalize = True).round(2).sort_index().rename('% Gender, Target=1')], axis = 1))

f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize = (10, 10))
g = sns.distplot(train_class_df.loc[(train_class_df['Target'] == 1) & (train_class_df['PatientSex'] == 'M'), 'PatientAge'], ax = ax1).set_title('Distribution of Age for Male, Pneumonia Evidence')
g = sns.distplot(train_class_df.loc[(train_class_df['Target'] == 1) & (train_class_df['PatientSex'] == 'F'), 'PatientAge'], ax = ax2).set_title('Distribution of Age for Female, Pneumonia Evidence')
g = sns.countplot(y = train_class_df['PatientSex'], ax = ax3, palette = 'PuOr').set_title('Count Plot of Gender, Overall')
g = sns.countplot(y = train_class_df.loc[(train_class_df['Target'] == 1), 'PatientSex'], ax = ax4, palette = 'PuOr').set_title('Count Plot of Gender, Pneumonia Evidence')
plt.tight_layout()
Checking distribution of age for those with Pneumonia Evidence, by Gender & Count Plot of Gender
--------------------------------------------------------------------------------
% Gender, Overall % Gender, Target=1
F 0.43 0.42
M 0.57 0.58
In [53]:
print('Exploring the bounding boxes centers for `PatientSex` for random sample = 1000')
df1 = bboxes[bboxes['PatientSex'] == 'M'].sample(1000)
df2 = bboxes[bboxes['PatientSex'] == 'F'].sample(1000)
bboxes_scatter(df1, df2, 'PatientSex = M', 'PatientSex = F')
Exploring the bounding boxes centers for `PatientSex` for random sample = 1000

Observations: PatientAge & PatientSex Above we saw,

For PatientAge we saw the distribution for both overall and where there were evidence of Pneumonia. Used binning to check the count of age bins. Count was highest for age group 40-78 both overall and with Pneumonia Evidence. Saw distribution of age for Male and Female with Pneumonia Evidence. Dataset had more Males (57%-58%) than Females (42%-43%). Only PatientAge, PatientSex and ViewPosition are useful features from metadata.

Dropping the other features from train_class dataframe and save that as a pickle file

In [54]:
train_class_df.drop(['BodyPartExamined', 'Modality', 'AgeBins'], inplace = True, axis = 1)
train_class_df.to_pickle('train_class_features.pkl')
display(train_class_df.shape, train_class_df.head())
(30227, 11)
patientId x y width height Target class path PatientSex PatientAge ViewPosition
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 PA
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 PA
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 AP
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 PA
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 AP
In [55]:
print('Checking sample for different classes')
sample1 = train_class_df.loc[train_class_df['class'] == 'Normal'].iloc[0]
sample2 = train_class_df.loc[train_class_df['class'] == 'No Lung Opacity / Not Normal'].iloc[0]
sample3 = train_class_df.loc[train_class_df['class'] == 'Lung Opacity'].iloc[1]
ds1 = pydicom.dcmread(sample1['path'])
ds2 = pydicom.dcmread(sample2['path'])
ds3 = pydicom.dcmread(sample3['path'])

f, ((ax1, ax2, ax3)) = plt.subplots(1, 3, figsize = (15, 8))
ax1.imshow(ds1.pixel_array, cmap = plt.cm.bone)
ax1.set_title('Class = Normal')
ax1.axis('off')
ax2.imshow(ds2.pixel_array, cmap = plt.cm.bone)
ax2.set_title('Class = No Lung Opacity / Not Normal')
ax2.axis('off')
ax3.imshow(ds3.pixel_array, cmap = plt.cm.bone)
ax3.set_title('Class = Lung Opacity')
ax3.axis('off')
plt.show()
Checking sample for different classes
In [56]:
sample4 = train_class_df.loc[(train_class_df['ViewPosition'] == 'AP')].iloc[0]
sample5 = train_class_df.loc[(train_class_df['ViewPosition'] == 'PA')].iloc[0]
ds4 = pydicom.dcmread(sample4['path'])
ds5 = pydicom.dcmread(sample5['path'])

f, ((ax1, ax2)) = plt.subplots(1, 2, figsize = (15, 8))
ax1.imshow(ds4.pixel_array, cmap = plt.cm.bone)
ax1.set_title('View Position = AP')
ax1.axis('off')
ax2.imshow(ds5.pixel_array, cmap = plt.cm.bone)
ax2.set_title('View Position = PA')
ax2.axis('off')
plt.show()
In [57]:
# Helper function to plot the dicom images
def plot_dicom_images(data, df, img_path):
    img_data = list(data.T.to_dict().values())
    #print(img_data)
    f, ax = plt.subplots(3, 3, figsize = (16, 18))
    for i, row in enumerate(img_data):
        image = row['patientId'] + '.dcm'
        #print(image)
        path = os.path.join(img_path, image)
        data = pydicom.read_file(path)
        rows = df[df['patientId'] == row['patientId']]
        age = rows.PatientAge.unique().tolist()[0]
        sex = data.PatientSex
        part = data.BodyPartExamined
        vp = data.ViewPosition
        modality = data.Modality
        data_img = pydicom.dcmread(path)
        #print(image)
        ax[i//3, i%3].imshow(data_img.pixel_array, cmap = plt.cm.bone)
        ax[i//3, i%3].axis('off')
        ax[i//3, i%3].set_title('ID: {}\nAge: {}, Sex: {}, Part: {}, VP: {}, Modality: {}\nTarget: {}, Class: {}\nWindow: {}:{}:{}:{}'\
                              .format(row['patientId'], age, sex, part, 
                                      vp, modality, row['Target'], 
                                      row['class'], row['x'], 
                                      row['y'], row['width'],
                                      row['height']))
        box_data = list(rows.T.to_dict().values())
        for j, row in enumerate(box_data):
            ax[i//3, i%3].add_patch(Rectangle(xy = (row['x'], row['y']),
                      width = row['width'], height = row['height'], 
                      color = 'red', alpha = 0.15)) 
    plt.show()
In [58]:
# this function is a part of custom module imported earlier (`eda`)
plot_dicom_images(data = train_class_df.loc[(train_class_df['Target'] == 1)].sample(9), 
                  df = train_class_df, img_path = TRAIN_IMG_DCM)

Now we will convert the images from dcm to jpg for faster processing of data

In [59]:
def convertImage(folder_path, jpg_folder_path):
    if not os.path.exists(jpg_folder_path): 
        os.makedirs(jpg_folder_path)
    images_path = os.listdir(folder_path)
    for n, image in tqdm_notebook(enumerate(images_path)):
        ds = pydicom.dcmread(os.path.join(folder_path, image))
        pixel_array_numpy = ds.pixel_array
        image = image.replace('.dcm', '.jpg')
        cv2.imwrite(os.path.join(jpg_folder_path, image), pixel_array_numpy)

Convert all training images

In [60]:
if os.listdir(TRAIN_IMG_DIR_JPG) == []: 
    print("No files found in the directory.") 
    convertImage(TRAIN_IMG_DCM, TRAIN_IMG_DIR_JPG)

Convert all test images

In [61]:
if os.listdir(TRAIN_IMG_DIR_JPG) == []: 
    print("No files found in the directory.") 
    convertImage(TEST_IMG_DCM, TEST_IMG_DIR_JPG)

Read data and Preparation for Model

In [62]:
ALPHA = 1
IMAGE_SIZE = 1024
IMAGE_HEIGHT = 224
IMAGE_WIDTH = 224
In [63]:
train_class_df.head(5)
Out[63]:
patientId x y width height Target class path PatientSex PatientAge ViewPosition
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 PA
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 PA
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 AP
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 PA
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 AP
In [64]:
train_class_df['x2']=train_class_df['x'] + train_class_df['width']
train_class_df['y2']=train_class_df['y'] + train_class_df['height']
train_class_df.rename(columns = {'x':'x1'}, inplace = True)
train_class_df.rename(columns = {'y':'y1'}, inplace = True)
In [65]:
def dropFeatures(train_class_updt_df):
    train_class_reduced_df = train_class_updt_df[['path', 'x1', 'y1','x2','y2','Target']].copy(deep = True)
    train_class_reduced_df['path'] = (train_class_reduced_df['path']
                                 .str.replace('stage_2_train_images', 'JPG_train')
                                 .str.replace('.dcm', '.jpg'))
    print('Distribution of target in the training set:'); 
    display(pd.Series(train_class_reduced_df['Target']).value_counts())
    train_class_reduced_df = train_class_reduced_df.reset_index()
    return train_class_reduced_df
In [66]:
train_class_updt_df = dropFeatures(train_class_df)
Distribution of target in the training set:
0    20672
1     9555
Name: Target, dtype: int64
In [67]:
train_class_updt_df.head()
Out[67]:
index path x1 y1 x2 y2 Target
0 0 JPG_train/0004cfab-14fd-4e49-80ba-63a80b6bddd6... 0.0 0.0 0.0 0.0 0
1 1 JPG_train/00313ee0-9eaa-42f4-b0ab-c148ed3241cd... 0.0 0.0 0.0 0.0 0
2 2 JPG_train/00322d4d-1c29-4943-afc9-b6754be640eb... 0.0 0.0 0.0 0.0 0
3 3 JPG_train/003d8fa0-6bf1-40ed-b54c-ac657f8495c5... 0.0 0.0 0.0 0.0 0
4 4 JPG_train/00436515-870c-4b36-a041-de91049b9ab4... 264.0 152.0 477.0 531.0 1
In [68]:
print('Distribution of target in the original set:'); 
display(pd.Series(train_class_updt_df['Target']).value_counts())
Distribution of target in the original set:
0    20672
1     9555
Name: Target, dtype: int64
In [69]:
def load_image(path):
    img = cv2.imread(path, 1)
    # OpenCV loads images with color channels
    # in BGR order. So we need to reverse them
    return img[...,::-1]
In [70]:
'''
Draw the bounding box on image
@param image - the image 
@param tl - bounding box top left corner
@param br - bounding box top bottom right
@return: a copy of input image with the bounding box on it
'''
def draw_boundingbox( image, tl, br ):
    copied_image = image.copy()
    cv2.imwrite("preview/test.jpeg",copied_image)
    image = cv2.rectangle(copied_image, tl, br, (0, 0, 255), 3)          
    return copied_image

'''Show image'''
def show_image( image ):
    # Drawing picture
    #plt.figure( figsize = (15,15) )
    plt.axis('off')
    plt.imshow( image )
    plt.show()

'''Plot all images in a list'''
def plot_images( images_list ):
    num_cols = 5
    num_rows = math.ceil( len(images_list) / num_cols )
    figsize = (18,8)
    fig = plt.figure( figsize=figsize)

    for i in range (0, len(images_list) ):
        info = images_list[i]
        x1, y1, x2, y2 = info[1]
        image = draw_boundingbox( info[0], (x1, y1), (x2, y2) )
        
        axi = fig.add_subplot(num_rows, num_cols, i+1)
        axi.axis( 'off' )
        axi.set_title( ("%.1f" % info[2] ) )
        axi.imshow( image )
        
def saveAugImage(images_list, df_augImage, origImgName, i):
    for i in range (0, len(images_list) ):
        info = images_list[i]
        imgName = TRAIN_IMG_AUG_JPG + "/" +"aug_" + str(i+1) + "_" + origImgName
        cv2.imwrite(imgName, info[0].copy())
        x1, y1, x2, y2 = info[1]
        df_augImage = df_augImage.append({'path':imgName, 'x1':x1, 'y1':y1, 'x2':x2, 'y2':y2, 'Target':1},
                                         ignore_index=True)
    return df_augImage 
In [71]:
def show_corner_bb(im, df_image, i):
    plt.imshow(im)
    plt.gca().add_patch(create_corner_rect(df_image, i))
In [72]:
def create_corner_rect(df_image, i, color='red'):
    return plt.Rectangle((df_image['x1'].iloc[i], df_image['y1'].iloc[i]), df_image['x2'].iloc[i]-df_image['x1'].iloc[i], df_image['y2'].iloc[i]-df_image['y1'].iloc[i], color=color,
                         fill=False, lw=3)
In [73]:
countImbalance = train_class_updt_df[train_class_updt_df.Target==0].shape[0] - train_class_updt_df[train_class_updt_df.Target==1].shape[0]
In [74]:
print('Number of additional images with pneumonia', countImbalance); 
Number of additional images with pneumonia 11117
In [75]:
df_minority = train_class_updt_df[train_class_updt_df.Target==1]
df_minority = df_minority.reset_index()
In [76]:
df_majority = train_class_updt_df[train_class_updt_df.Target==0]
df_majority = df_majority.reset_index()
In [77]:
df_minority.shape, df_majority.shape
Out[77]:
((9555, 8), (20672, 8))
In [78]:
df_minority.head()
Out[78]:
level_0 index path x1 y1 x2 y2 Target
0 4 4 JPG_train/00436515-870c-4b36-a041-de91049b9ab4... 264.0 152.0 477.0 531.0 1
1 5 5 JPG_train/00436515-870c-4b36-a041-de91049b9ab4... 562.0 152.0 818.0 605.0 1
2 8 8 JPG_train/00704310-78a8-4b38-8475-49f4573b2dbb... 323.0 577.0 483.0 681.0 1
3 9 9 JPG_train/00704310-78a8-4b38-8475-49f4573b2dbb... 695.0 575.0 857.0 712.0 1
4 14 14 JPG_train/00aecb01-a116-45a2-956c-08d2fa55433f... 288.0 322.0 382.0 457.0 1
In [79]:
df_majority.head()
Out[79]:
level_0 index path x1 y1 x2 y2 Target
0 0 0 JPG_train/0004cfab-14fd-4e49-80ba-63a80b6bddd6... 0.0 0.0 0.0 0.0 0
1 1 1 JPG_train/00313ee0-9eaa-42f4-b0ab-c148ed3241cd... 0.0 0.0 0.0 0.0 0
2 2 2 JPG_train/00322d4d-1c29-4943-afc9-b6754be640eb... 0.0 0.0 0.0 0.0 0
3 3 3 JPG_train/003d8fa0-6bf1-40ed-b54c-ac657f8495c5... 0.0 0.0 0.0 0.0 0
4 6 6 JPG_train/00569f44-917d-4c86-a842-81832af98c30... 0.0 0.0 0.0 0.0 0
In [80]:
def rotate_image( image, angle, bounding_box ):
    # get image dimension
    img_height, img_width = image.shape[:2]
    # get rotation matrix
    rotation_matrix = cv2.getRotationMatrix2D( center = (img_width // 2, img_height // 2), angle = angle, scale = 1.0 )
    # apply transformation (ratate image) 
    rotated_image = cv2.warpAffine( image, rotation_matrix, (img_width, img_height) )
    # --- compute new bounding box ---
    # Apply same transformation to the four bounding box corners
    rotated_point_A = np.matmul( rotation_matrix, np.array( [bounding_box[0], bounding_box[1], 1] ).T )   
    rotated_point_B = np.matmul( rotation_matrix, np.array( [bounding_box[2], bounding_box[1], 1] ).T )   
    rotated_point_C = np.matmul( rotation_matrix, np.array( [bounding_box[2], bounding_box[3], 1] ).T )   
    rotated_point_D = np.matmul( rotation_matrix, np.array( [bounding_box[0], bounding_box[3], 1] ).T )   
    # Compute new bounding box, that is, the bounding box for rotated object
    x = np.array( [ rotated_point_A[0], rotated_point_B[0], rotated_point_C[0], rotated_point_D[0] ] )
    y = np.array( [ rotated_point_A[1], rotated_point_B[1], rotated_point_C[1], rotated_point_D[1] ] )
    new_boundingbox = [np.min( x ).astype(int), np.min( y ).astype(int), np.max( x ).astype(int), np.max( y ).astype(int)]
    return rotated_image, new_boundingbox
In [81]:
t_dic = { "rotation":rotate_image}
f_dic = { "rotation":(0, 90)}
import random
def apply_transformation( image, bounding_box, transformation, n ):
    t_images_list = []
    for i in range(0, n):
        interval = f_dic[transformation]
        factor = random.uniform(interval[0], interval[1])
        img, bb = t_dic[transformation]( image, factor, bounding_box )
        t_images_list.append( (img, bb, factor) )
    return t_images_list
In [82]:
df_augImg = pd.DataFrame(columns=['path','x1','y1','x2','y2','Target'])

if os.path.isdir(TRAIN_IMG_AUG_JPG):
    if os.listdir(TRAIN_IMG_AUG_JPG) != []: 
        shutil.rmtree(TRAIN_IMG_AUG_JPG)
        os.mkdir(TRAIN_IMG_AUG_JPG)
else:
    os.mkdir(TRAIN_IMG_AUG_JPG)
In [83]:
for i in tqdm_notebook(range(0, 5558)):
    imgPath = df_minority['path'].iloc[i]
    imgName = imgPath.split("/", 1)[1]
    #print(imgName)
    original_boundingbox = [df_minority['x1'].iloc[i], df_minority['y1'].iloc[i], 
                            df_minority['x2'].iloc[i], df_minority['y2'].iloc[i]]
    rotated_images = apply_transformation(load_image(df_minority['path'].iloc[i]), 
                                          original_boundingbox, "rotation", 2 )
    df_augImg = saveAugImage(rotated_images, df_augImg, imgName, i)

In [84]:
df_minority_updt = df_minority[['path', 'x1', 'y1','x2','y2','Target']].copy(deep = True)
#df_minority_updt['path'] = (df_minority_updt['path']
#                                 .str.replace(TRAIN_IMG_DIR_JPG, TRAIN_IMG_AUG_JPG))
In [85]:
df_minority_updt.shape
Out[85]:
(9555, 6)
In [86]:
df_majority_updt = df_majority[['path', 'x1', 'y1','x2','y2','Target']].copy(deep = True)
#df_majority_updt['path'] = (df_majority_updt['path']
#                                 .str.replace(TRAIN_IMG_DIR_JPG, TRAIN_IMG_AUG_JPG))
In [87]:
df_majority_updt.shape
Out[87]:
(20672, 6)
In [88]:
df_train_class_merged = pd.concat([df_minority_updt, df_majority_updt, df_augImg], ignore_index=True)
In [89]:
df_train_class_merged.shape
Out[89]:
(41343, 6)
In [90]:
print('Distribution of target in the updated set:'); 
display(pd.Series(df_train_class_merged['Target']).value_counts())
Distribution of target in the updated set:
0    20672
1    20671
Name: Target, dtype: int64
In [91]:
df_augImg.head(4)
Out[91]:
path x1 y1 x2 y2 Target
0 JPG_train_aug/aug_1_00436515-870c-4b36-a041-de... 162 181 484 608 1
1 JPG_train_aug/aug_2_00436515-870c-4b36-a041-de... 83 277 500 698 1
2 JPG_train_aug/aug_1_00436515-870c-4b36-a041-de... 285 42 790 539 1
3 JPG_train_aug/aug_2_00436515-870c-4b36-a041-de... 213 76 732 509 1
In [92]:
def show_corner_bb(im, df_image, i):
    plt.imshow(im)
    plt.gca().add_patch(create_corner_rect(df_image, i))
In [93]:
def create_corner_rect(df_image, i, color='red'):
    return plt.Rectangle((df_image['x1'].iloc[i], df_image['y1'].iloc[i]), df_image['x2'].iloc[i]-df_image['x1'].iloc[i], df_image['y2'].iloc[i]-df_image['y1'].iloc[i], color=color,
                         fill=False, lw=3)
In [94]:
#original
i = 1
im = cv2.imread(str(df_augImg['path'].iloc[i]))
im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
show_corner_bb(im, df_augImg, i)
In [ ]:
#original
i = 10
im = cv2.imread(str(df_augImg['path'].iloc[i]))
im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
show_corner_bb(im, df_augImg, i)
In [95]:
train_class_reduced_df = df_train_class_merged.sample(frac = nbrImages) 
In [96]:
train_class_reduced_df.shape
Out[96]:
(41343, 6)
In [97]:
train_class_reduced_df.head()
Out[97]:
path x1 y1 x2 y2 Target
17964 JPG_train/7dd70f51-5d1b-49a8-9947-aabd95bcfb99... 0 0 0 0 0
31601 JPG_train_aug/aug_1_17398977-1b0f-4de4-9038-9c... 510 114 1121 693 1
29705 JPG_train/0ea1d0f4-e828-487f-9f29-06e094018054... 0 0 0 0 0
20737 JPG_train/9c52f20d-a5ea-45b2-a289-e6b7d2b614e5... 0 0 0 0 0
34763 JPG_train_aug/aug_1_3fcc1a52-4329-4c61-9a79-f6... 319 544 532 751 1
In [98]:
print('Distribution of target in the updated set:'); 
display(pd.Series(train_class_reduced_df['Target']).value_counts())
print('Distribution of target in the merged set:'); 
display(pd.Series(df_train_class_merged['Target']).value_counts())
Distribution of target in the updated set:
0    20672
1    20671
Name: Target, dtype: int64
Distribution of target in the merged set:
0    20672
1    20671
Name: Target, dtype: int64
In [99]:
df_train_class_merged.head()
Out[99]:
path x1 y1 x2 y2 Target
0 JPG_train/00436515-870c-4b36-a041-de91049b9ab4... 264 152 477 531 1
1 JPG_train/00436515-870c-4b36-a041-de91049b9ab4... 562 152 818 605 1
2 JPG_train/00704310-78a8-4b38-8475-49f4573b2dbb... 323 577 483 681 1
3 JPG_train/00704310-78a8-4b38-8475-49f4573b2dbb... 695 575 857 712 1
4 JPG_train/00aecb01-a116-45a2-956c-08d2fa55433f... 288 322 382 457 1
In [100]:
train_class_reduced_df.head()
Out[100]:
path x1 y1 x2 y2 Target
17964 JPG_train/7dd70f51-5d1b-49a8-9947-aabd95bcfb99... 0 0 0 0 0
31601 JPG_train_aug/aug_1_17398977-1b0f-4de4-9038-9c... 510 114 1121 693 1
29705 JPG_train/0ea1d0f4-e828-487f-9f29-06e094018054... 0 0 0 0 0
20737 JPG_train/9c52f20d-a5ea-45b2-a289-e6b7d2b614e5... 0 0 0 0 0
34763 JPG_train_aug/aug_1_3fcc1a52-4329-4c61-9a79-f6... 319 544 532 751 1
In [101]:
train_class_updt_df = dropFeatures(train_class_reduced_df)
Distribution of target in the training set:
0    20672
1    20671
Name: Target, dtype: int64

We need to handle imbalance hence we are going to delete non pneumonic person from the dataset

In [ ]:
"""
def balanceSample(train_class_updt_df):
    df_majority = train_class_updt_df[train_class_updt_df.Target==0]
    df_minority = train_class_updt_df[train_class_updt_df.Target==1]
    # Downsample majority class
    df_majority_downsampled = resample(df_majority, 
                                     replace=False,    # sample without replacement
                                     n_samples=df_minority.shape[0],     # to match minority class
                                     random_state=0) # reproducible results

    # Combine minority class with downsampled majority class
    df_majority_downsampled = pd.concat([df_majority_downsampled, df_minority])

    # Display new class counts
    display(df_majority_downsampled.Target.value_counts())
    df_majority_downsampled = df_majority_downsampled.reset_index()
    return df_majority_downsampled
"""
Out[ ]:
'\ndef balanceSample(train_class_updt_df):\n    df_majority = train_class_updt_df[train_class_updt_df.Target==0]\n    df_minority = train_class_updt_df[train_class_updt_df.Target==1]\n    # Downsample majority class\n    df_majority_downsampled = resample(df_majority, \n                                     replace=False,    # sample without replacement\n                                     n_samples=df_minority.shape[0],     # to match minority class\n                                     random_state=0) # reproducible results\n\n    # Combine minority class with downsampled majority class\n    df_majority_downsampled = pd.concat([df_majority_downsampled, df_minority])\n\n    # Display new class counts\n    display(df_majority_downsampled.Target.value_counts())\n    df_majority_downsampled = df_majority_downsampled.reset_index()\n    return df_majority_downsampled\n'
In [ ]:
train_class_updt_df.head()
Out[ ]:
index path x1 y1 x2 y2 Target
0 7975 JPG_train/d6ea0603-0633-4cc8-8d36-f523370de74f... 664 220 958 653 1
1 25497 JPG_train/d70f60a3-4a94-43b0-aaa0-21d4f5214059... 0 0 0 0 0
2 12465 JPG_train/45eecdb1-b5d2-48e8-aed2-c24dd02f803b... 0 0 0 0 0
3 29034 JPG_train/fabcaf4a-8724-4d33-848a-541457221d89... 0 0 0 0 0
4 38441 JPG_train_aug/aug_1_849217ad-5cb2-4a87-9796-42... 476 123 797 390 1
In [ ]:
train_class_updt_df.shape
Out[ ]:
(41343, 7)
In [ ]:
def maskImage(train_class_updt_df):
    masks = np.zeros((int(train_class_updt_df.shape[0]), IMAGE_HEIGHT, IMAGE_WIDTH))
    X = np.zeros((int(train_class_updt_df.shape[0]), IMAGE_HEIGHT, IMAGE_WIDTH, 3))
    for index in tqdm_notebook(range(train_class_updt_df.shape[0])):
        img = load_image(train_class_updt_df['path'][index])
        img = cv2.resize(img, dsize=(IMAGE_HEIGHT, IMAGE_WIDTH), interpolation=cv2.INTER_CUBIC)
        try:
          img = img[:, :, :3]
        except:
          continue
        x1 = train_class_updt_df['x1'][index]   
        y1 = train_class_updt_df['y1'][index]
        x2 = train_class_updt_df['x2'][index]
        y2 = train_class_updt_df['y2'][index]
        X[index] = preprocess_input(np.array(img, dtype=np.float32))
        
        x1 = int((x1 * IMAGE_WIDTH) / IMAGE_SIZE) 
        x2 = int((x2 * IMAGE_WIDTH) / IMAGE_SIZE)   
        y1 = int((y1 * IMAGE_HEIGHT) / IMAGE_SIZE) 
        y2 = int((y2 * IMAGE_HEIGHT) / IMAGE_SIZE) 
        masks[index][y1:y2, x1:x2] = 1
    return X, masks
In [ ]:
X, masks = maskImage(train_class_updt_df[0:1000])
In [ ]:
x=len(train_class_updt_df)-5
y=5
m=5
for i in range(1,42,1):
  y=y+1000
  x=x-1000
  df=train_class_updt_df[m:y]
  X1, masks1 = maskImage(df)
  X=np.append(X,X1)
  masks=np.append(masks,masks1)
  m=y
  if(x<1000):
    df=train_class_updt_df[m:y]
    X1, masks1 = maskImage(df)
    X=np.append(X,X1)
    masks=np.append(masks,masks1)


In [ ]:
X2, masks2 = maskImage(train_class_updt_df[0:5])

In [ ]:
masks.shape, X.shape
Out[ ]:
((827, 224, 224), (827, 224, 224, 3))
In [ ]:
def viewImage(n, X, y):
    f, ((ax1, ax2)) = plt.subplots(1, 2, figsize = (15, 8))
    ax1.imshow(X[n], cmap = plt.cm.bone)
    ax1.set_title('Original Image')
    ax1.axis('off')
    ax2.imshow(y[n], cmap = plt.cm.bone)
    ax2.set_title('Masked Image')
    ax2.axis('off')
    plt.show()
In [ ]:
#X_train, y_train, X_test, y_test = splitData(X, masks,25000)
In [ ]:
X_train, X_test, y_train, y_test = train_test_split(X, masks, test_size=0.20, random_state=0)
In [ ]:
X_train.shape, y_train.shape, X_test.shape, y_test.shape
Out[ ]:
((661, 224, 224, 3), (661, 224, 224), (166, 224, 224, 3), (166, 224, 224))
In [ ]:
viewImage(1, X_train, y_train)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
In [ ]:
viewImage(10, X_train, y_train)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
In [ ]:
X_train.shape
Out[ ]:
(661, 224, 224, 3)

Model 1 : Predict Bounding Boxes

a) Mobile Net

In [ ]:
import time
In [ ]:
def createMobileNetModel(trainable=True):
    mobnet = MobileNet(input_shape=(IMAGE_HEIGHT, IMAGE_WIDTH, 3), include_top=False, alpha=1.0, weights="imagenet")

    for layer in mobnet.layers:
        layer.trainable = trainable

    block1 = mobnet.get_layer("conv_pw_5_relu").output
    block2 = mobnet.get_layer("conv_pw_11_relu").output
    block3 = mobnet.get_layer("conv_pw_13_relu").output

    x = Concatenate()([UpSampling2D()(block3), block2])
    x = Concatenate()([UpSampling2D()(x), block1])
    x = Conv2D(1, kernel_size=1, activation="sigmoid")(x)
    x = UpSampling2D()(x)
    x = UpSampling2D()(x)
    x = UpSampling2D()(x)
    
    x = Reshape((IMAGE_HEIGHT, IMAGE_HEIGHT))(x)

    return Model(inputs=mobnet.input, outputs=x)
In [ ]:
model = createMobileNetModel(False)
model.summary()
Model: "model_2"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_3 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, 112, 112, 32) 864         input_3[0][0]                    
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 32) 128         conv1[0][0]                      
__________________________________________________________________________________________________
conv1_relu (ReLU)               (None, 112, 112, 32) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
conv_dw_1 (DepthwiseConv2D)     (None, 112, 112, 32) 288         conv1_relu[0][0]                 
__________________________________________________________________________________________________
conv_dw_1_bn (BatchNormalizatio (None, 112, 112, 32) 128         conv_dw_1[0][0]                  
__________________________________________________________________________________________________
conv_dw_1_relu (ReLU)           (None, 112, 112, 32) 0           conv_dw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_1 (Conv2D)              (None, 112, 112, 64) 2048        conv_dw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_1_bn (BatchNormalizatio (None, 112, 112, 64) 256         conv_pw_1[0][0]                  
__________________________________________________________________________________________________
conv_pw_1_relu (ReLU)           (None, 112, 112, 64) 0           conv_pw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_2 (ZeroPadding2D)      (None, 113, 113, 64) 0           conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_2 (DepthwiseConv2D)     (None, 56, 56, 64)   576         conv_pad_2[0][0]                 
__________________________________________________________________________________________________
conv_dw_2_bn (BatchNormalizatio (None, 56, 56, 64)   256         conv_dw_2[0][0]                  
__________________________________________________________________________________________________
conv_dw_2_relu (ReLU)           (None, 56, 56, 64)   0           conv_dw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_2 (Conv2D)              (None, 56, 56, 128)  8192        conv_dw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_2_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_2[0][0]                  
__________________________________________________________________________________________________
conv_pw_2_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_3 (DepthwiseConv2D)     (None, 56, 56, 128)  1152        conv_pw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_dw_3[0][0]                  
__________________________________________________________________________________________________
conv_dw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_dw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_3 (Conv2D)              (None, 56, 56, 128)  16384       conv_dw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_3[0][0]                  
__________________________________________________________________________________________________
conv_pw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_4 (ZeroPadding2D)      (None, 57, 57, 128)  0           conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_4 (DepthwiseConv2D)     (None, 28, 28, 128)  1152        conv_pad_4[0][0]                 
__________________________________________________________________________________________________
conv_dw_4_bn (BatchNormalizatio (None, 28, 28, 128)  512         conv_dw_4[0][0]                  
__________________________________________________________________________________________________
conv_dw_4_relu (ReLU)           (None, 28, 28, 128)  0           conv_dw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_4 (Conv2D)              (None, 28, 28, 256)  32768       conv_dw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_4_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_4[0][0]                  
__________________________________________________________________________________________________
conv_pw_4_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_5 (DepthwiseConv2D)     (None, 28, 28, 256)  2304        conv_pw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_dw_5[0][0]                  
__________________________________________________________________________________________________
conv_dw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_dw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_5 (Conv2D)              (None, 28, 28, 256)  65536       conv_dw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_5[0][0]                  
__________________________________________________________________________________________________
conv_pw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_6 (ZeroPadding2D)      (None, 29, 29, 256)  0           conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_6 (DepthwiseConv2D)     (None, 14, 14, 256)  2304        conv_pad_6[0][0]                 
__________________________________________________________________________________________________
conv_dw_6_bn (BatchNormalizatio (None, 14, 14, 256)  1024        conv_dw_6[0][0]                  
__________________________________________________________________________________________________
conv_dw_6_relu (ReLU)           (None, 14, 14, 256)  0           conv_dw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_6 (Conv2D)              (None, 14, 14, 512)  131072      conv_dw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_6_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_6[0][0]                  
__________________________________________________________________________________________________
conv_pw_6_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_7 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_7[0][0]                  
__________________________________________________________________________________________________
conv_dw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_7 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_7[0][0]                  
__________________________________________________________________________________________________
conv_pw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_8 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_8[0][0]                  
__________________________________________________________________________________________________
conv_dw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_8 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_8[0][0]                  
__________________________________________________________________________________________________
conv_pw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_9 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_9[0][0]                  
__________________________________________________________________________________________________
conv_dw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_9 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_9[0][0]                  
__________________________________________________________________________________________________
conv_pw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_10 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_10[0][0]                 
__________________________________________________________________________________________________
conv_dw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_10 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_10[0][0]                 
__________________________________________________________________________________________________
conv_pw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_11 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_11[0][0]                 
__________________________________________________________________________________________________
conv_dw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_11 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_11[0][0]                 
__________________________________________________________________________________________________
conv_pw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pad_12 (ZeroPadding2D)     (None, 15, 15, 512)  0           conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_12 (DepthwiseConv2D)    (None, 7, 7, 512)    4608        conv_pad_12[0][0]                
__________________________________________________________________________________________________
conv_dw_12_bn (BatchNormalizati (None, 7, 7, 512)    2048        conv_dw_12[0][0]                 
__________________________________________________________________________________________________
conv_dw_12_relu (ReLU)          (None, 7, 7, 512)    0           conv_dw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_12 (Conv2D)             (None, 7, 7, 1024)   524288      conv_dw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_12_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_12[0][0]                 
__________________________________________________________________________________________________
conv_pw_12_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_13 (DepthwiseConv2D)    (None, 7, 7, 1024)   9216        conv_pw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_dw_13[0][0]                 
__________________________________________________________________________________________________
conv_dw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_dw_13_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_13 (Conv2D)             (None, 7, 7, 1024)   1048576     conv_dw_13_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_13[0][0]                 
__________________________________________________________________________________________________
conv_pw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_13_bn[0][0]              
__________________________________________________________________________________________________
up_sampling2d_10 (UpSampling2D) (None, 14, 14, 1024) 0           conv_pw_13_relu[0][0]            
__________________________________________________________________________________________________
concatenate_7 (Concatenate)     (None, 14, 14, 1536) 0           up_sampling2d_10[0][0]           
                                                                 conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
up_sampling2d_11 (UpSampling2D) (None, 28, 28, 1536) 0           concatenate_7[0][0]              
__________________________________________________________________________________________________
concatenate_8 (Concatenate)     (None, 28, 28, 1792) 0           up_sampling2d_11[0][0]           
                                                                 conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 28, 28, 1)    1793        concatenate_8[0][0]              
__________________________________________________________________________________________________
up_sampling2d_12 (UpSampling2D) (None, 56, 56, 1)    0           conv2d_2[0][0]                   
__________________________________________________________________________________________________
up_sampling2d_13 (UpSampling2D) (None, 112, 112, 1)  0           up_sampling2d_12[0][0]           
__________________________________________________________________________________________________
up_sampling2d_14 (UpSampling2D) (None, 224, 224, 1)  0           up_sampling2d_13[0][0]           
__________________________________________________________________________________________________
reshape_2 (Reshape)             (None, 224, 224)     0           up_sampling2d_14[0][0]           
==================================================================================================
Total params: 3,230,657
Trainable params: 1,793
Non-trainable params: 3,228,864
__________________________________________________________________________________________________
In [ ]:
def dice_coefficient(y_true, y_pred):
    numerator = 2 * tf.reduce_sum(y_true * y_pred)
    denominator = tf.reduce_sum(y_true + y_pred)

    return numerator / (denominator + tf.keras.backend.epsilon())
In [ ]:
def loss(y_true, y_pred):
    return binary_crossentropy(y_true, y_pred) - tf.keras.backend.log(dice_coefficient(y_true, y_pred) + tf.keras.backend.epsilon())
In [ ]:
optimizer = Adam(lr=1e-4, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
model.compile(loss=loss, optimizer=optimizer, metrics=[dice_coefficient])
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-90-188890cac41d> in <module>()
      1 optimizer = Adam(lr=1e-4, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
----> 2 model.compile(loss=loss, optimizer=optimizer, metrics=[dice_coefficient])

NameError: name 'model' is not defined
In [ ]:
checkpoint = ModelCheckpoint("model-{val_loss:.2f}.h5", monitor="val_loss", verbose=1, save_best_only=True)

stop = EarlyStopping(monitor="val_loss", patience=5)

reduce_lr = ReduceLROnPlateau(monitor="val_loss", factor=0.2, patience=3, min_lr=1e-6, verbose=1)
In [ ]:
start = time.time()
In [ ]:
model.fit(X_train, y_train, epochs = 50, 
          batch_size = 1, callbacks = [checkpoint, reduce_lr, stop], validation_data = (X_test, y_test))
Epoch 1/50
661/661 [==============================] - 5s 6ms/step - loss: 9.8888 - dice_coefficient: 0.0737 - val_loss: 8.6226 - val_dice_coefficient: 0.1240

Epoch 00001: val_loss improved from inf to 8.62255, saving model to model-8.62.h5
Epoch 2/50
661/661 [==============================] - 3s 5ms/step - loss: 8.9156 - dice_coefficient: 0.1175 - val_loss: 8.5202 - val_dice_coefficient: 0.1382

Epoch 00002: val_loss improved from 8.62255 to 8.52023, saving model to model-8.52.h5
Epoch 3/50
661/661 [==============================] - 3s 5ms/step - loss: 8.9961 - dice_coefficient: 0.1360 - val_loss: 8.4916 - val_dice_coefficient: 0.1518

Epoch 00003: val_loss improved from 8.52023 to 8.49155, saving model to model-8.49.h5
Epoch 4/50
661/661 [==============================] - 3s 5ms/step - loss: 8.8442 - dice_coefficient: 0.1493 - val_loss: 8.4599 - val_dice_coefficient: 0.1531

Epoch 00004: val_loss improved from 8.49155 to 8.45988, saving model to model-8.46.h5
Epoch 5/50
661/661 [==============================] - 3s 5ms/step - loss: 9.2442 - dice_coefficient: 0.1444 - val_loss: 8.4485 - val_dice_coefficient: 0.1522

Epoch 00005: val_loss improved from 8.45988 to 8.44849, saving model to model-8.45.h5
Epoch 6/50
661/661 [==============================] - 3s 5ms/step - loss: 9.6632 - dice_coefficient: 0.1396 - val_loss: 8.4359 - val_dice_coefficient: 0.1599

Epoch 00006: val_loss improved from 8.44849 to 8.43591, saving model to model-8.44.h5
Epoch 7/50
661/661 [==============================] - 4s 5ms/step - loss: 9.1312 - dice_coefficient: 0.1595 - val_loss: 8.4313 - val_dice_coefficient: 0.1575

Epoch 00007: val_loss improved from 8.43591 to 8.43133, saving model to model-8.43.h5
Epoch 8/50
661/661 [==============================] - 3s 5ms/step - loss: 9.4222 - dice_coefficient: 0.1547 - val_loss: 8.4251 - val_dice_coefficient: 0.1606

Epoch 00008: val_loss improved from 8.43133 to 8.42512, saving model to model-8.43.h5
Epoch 9/50
661/661 [==============================] - 3s 5ms/step - loss: 9.0313 - dice_coefficient: 0.1698 - val_loss: 8.4223 - val_dice_coefficient: 0.1622

Epoch 00009: val_loss improved from 8.42512 to 8.42229, saving model to model-8.42.h5
Epoch 10/50
661/661 [==============================] - 3s 5ms/step - loss: 8.6575 - dice_coefficient: 0.1892 - val_loss: 8.4234 - val_dice_coefficient: 0.1603

Epoch 00010: val_loss did not improve from 8.42229
Epoch 11/50
661/661 [==============================] - 3s 5ms/step - loss: 9.3431 - dice_coefficient: 0.1685 - val_loss: 8.4257 - val_dice_coefficient: 0.1593

Epoch 00011: val_loss did not improve from 8.42229
Epoch 12/50
661/661 [==============================] - 4s 6ms/step - loss: 8.8573 - dice_coefficient: 0.1788 - val_loss: 8.4240 - val_dice_coefficient: 0.1605

Epoch 00012: val_loss did not improve from 8.42229

Epoch 00012: ReduceLROnPlateau reducing learning rate to 1.9999999494757503e-05.
Epoch 13/50
661/661 [==============================] - 3s 5ms/step - loss: 9.6695 - dice_coefficient: 0.1641 - val_loss: 8.4244 - val_dice_coefficient: 0.1604

Epoch 00013: val_loss did not improve from 8.42229
Epoch 14/50
661/661 [==============================] - 3s 5ms/step - loss: 9.3687 - dice_coefficient: 0.1739 - val_loss: 8.4329 - val_dice_coefficient: 0.1572

Epoch 00014: val_loss did not improve from 8.42229
Out[ ]:
<tensorflow.python.keras.callbacks.History at 0x7f38a2e71630>
In [ ]:
print(f'Time: {time.time() - start}')
Time: 107.73850655555725
In [ ]:
scores = model.evaluate(X_test, y_test, verbose = 1)
6/6 [==============================] - 1s 46ms/step - loss: 1.5006 - dice_coefficient: 0.2539
In [ ]:
print("Accuracy: ", scores[1])
print("Loss: ", scores[0])
Accuracy:  0.2539321482181549
Loss:  1.5005985498428345
In [ ]:
y_pred = model.predict(X_test, verbose = 1)
6/6 [==============================] - 1s 53ms/step
In [ ]:
viewImage(3, X_test, y_pred)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).

b) Mobile Net with additional layers

In [ ]:
def conv_block_simple(prevlayer, filters, prefix, strides=(1, 1)):
    conv = Conv2D(filters, (3, 3), padding = 'same', kernel_initializer = 'he_normal', strides = strides, name = prefix + '_conv')(prevlayer)
    conv = BatchNormalization(name = prefix + 'BatchNormalization')(conv)
    conv = Activation('relu', name = prefix + 'ActivationLayer')(conv)
    return conv

def createMobileNetModel2(trainable=True):
    model = MobileNet(input_shape=(IMAGE_HEIGHT, IMAGE_WIDTH, 3), include_top=False, alpha=ALPHA, weights="imagenet")

    for layer in model.layers:
        layer.trainable = trainable
    block1 = model.get_layer('conv_pw_13_relu').output
    block2 = model.get_layer('conv_pw_11_relu').output
    block3 = model.get_layer('conv_pw_5_relu').output
    block4 = model.get_layer('conv_pw_3_relu').output
    block5 = model.get_layer('conv_pw_1_relu').output

    up1 = Concatenate()([UpSampling2D()(block1), block2])
    conv6 = conv_block_simple(up1, 256, 'Conv_6_1')
    conv6 = conv_block_simple(conv6, 256, 'Conv_6_2')

    up2 = Concatenate()([UpSampling2D()(conv6), block3])
    conv7 = conv_block_simple(up2, 256, 'Conv_7_1')
    conv7 = conv_block_simple(conv7, 256, 'Conv_7_2')

    up3 = Concatenate()([UpSampling2D()(conv7), block4])
    conv8 = conv_block_simple(up3, 192, 'Conv_8_1')
    conv8 = conv_block_simple(conv8, 128, 'Conv_8_2')

    up4 = Concatenate()([UpSampling2D()(conv8), block5])
    conv9 = conv_block_simple(up4, 96, 'Conv_9_1')
    conv9 = conv_block_simple(conv9, 64, 'Conv_9_2')

    up5 = Concatenate()([UpSampling2D()(conv9), model.input])
    conv10 = conv_block_simple(up5, 48, 'Conv_10_1')
    conv10 = conv_block_simple(conv10, 32, 'Conv_10_2')
    conv10 = SpatialDropout2D(0.2)(conv10)
    
    x = Conv2D(1, (1, 1), activation = 'sigmoid')(conv10)
    x = Reshape((IMAGE_HEIGHT, IMAGE_HEIGHT))(x)
    return Model(inputs = model.input, outputs = x)
In [ ]:
model = createMobileNetModel2(False)
model.summary()
Model: "model_3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_4 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, 112, 112, 32) 864         input_4[0][0]                    
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 32) 128         conv1[0][0]                      
__________________________________________________________________________________________________
conv1_relu (ReLU)               (None, 112, 112, 32) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
conv_dw_1 (DepthwiseConv2D)     (None, 112, 112, 32) 288         conv1_relu[0][0]                 
__________________________________________________________________________________________________
conv_dw_1_bn (BatchNormalizatio (None, 112, 112, 32) 128         conv_dw_1[0][0]                  
__________________________________________________________________________________________________
conv_dw_1_relu (ReLU)           (None, 112, 112, 32) 0           conv_dw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_1 (Conv2D)              (None, 112, 112, 64) 2048        conv_dw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_1_bn (BatchNormalizatio (None, 112, 112, 64) 256         conv_pw_1[0][0]                  
__________________________________________________________________________________________________
conv_pw_1_relu (ReLU)           (None, 112, 112, 64) 0           conv_pw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_2 (ZeroPadding2D)      (None, 113, 113, 64) 0           conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_2 (DepthwiseConv2D)     (None, 56, 56, 64)   576         conv_pad_2[0][0]                 
__________________________________________________________________________________________________
conv_dw_2_bn (BatchNormalizatio (None, 56, 56, 64)   256         conv_dw_2[0][0]                  
__________________________________________________________________________________________________
conv_dw_2_relu (ReLU)           (None, 56, 56, 64)   0           conv_dw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_2 (Conv2D)              (None, 56, 56, 128)  8192        conv_dw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_2_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_2[0][0]                  
__________________________________________________________________________________________________
conv_pw_2_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_3 (DepthwiseConv2D)     (None, 56, 56, 128)  1152        conv_pw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_dw_3[0][0]                  
__________________________________________________________________________________________________
conv_dw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_dw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_3 (Conv2D)              (None, 56, 56, 128)  16384       conv_dw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_3[0][0]                  
__________________________________________________________________________________________________
conv_pw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_4 (ZeroPadding2D)      (None, 57, 57, 128)  0           conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_4 (DepthwiseConv2D)     (None, 28, 28, 128)  1152        conv_pad_4[0][0]                 
__________________________________________________________________________________________________
conv_dw_4_bn (BatchNormalizatio (None, 28, 28, 128)  512         conv_dw_4[0][0]                  
__________________________________________________________________________________________________
conv_dw_4_relu (ReLU)           (None, 28, 28, 128)  0           conv_dw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_4 (Conv2D)              (None, 28, 28, 256)  32768       conv_dw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_4_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_4[0][0]                  
__________________________________________________________________________________________________
conv_pw_4_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_5 (DepthwiseConv2D)     (None, 28, 28, 256)  2304        conv_pw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_dw_5[0][0]                  
__________________________________________________________________________________________________
conv_dw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_dw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_5 (Conv2D)              (None, 28, 28, 256)  65536       conv_dw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_5[0][0]                  
__________________________________________________________________________________________________
conv_pw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_6 (ZeroPadding2D)      (None, 29, 29, 256)  0           conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_6 (DepthwiseConv2D)     (None, 14, 14, 256)  2304        conv_pad_6[0][0]                 
__________________________________________________________________________________________________
conv_dw_6_bn (BatchNormalizatio (None, 14, 14, 256)  1024        conv_dw_6[0][0]                  
__________________________________________________________________________________________________
conv_dw_6_relu (ReLU)           (None, 14, 14, 256)  0           conv_dw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_6 (Conv2D)              (None, 14, 14, 512)  131072      conv_dw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_6_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_6[0][0]                  
__________________________________________________________________________________________________
conv_pw_6_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_7 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_7[0][0]                  
__________________________________________________________________________________________________
conv_dw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_7 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_7[0][0]                  
__________________________________________________________________________________________________
conv_pw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_8 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_8[0][0]                  
__________________________________________________________________________________________________
conv_dw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_8 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_8[0][0]                  
__________________________________________________________________________________________________
conv_pw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_9 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_9[0][0]                  
__________________________________________________________________________________________________
conv_dw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_9 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_9[0][0]                  
__________________________________________________________________________________________________
conv_pw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_10 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_10[0][0]                 
__________________________________________________________________________________________________
conv_dw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_10 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_10[0][0]                 
__________________________________________________________________________________________________
conv_pw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_11 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_11[0][0]                 
__________________________________________________________________________________________________
conv_dw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_11 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_11[0][0]                 
__________________________________________________________________________________________________
conv_pw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pad_12 (ZeroPadding2D)     (None, 15, 15, 512)  0           conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_12 (DepthwiseConv2D)    (None, 7, 7, 512)    4608        conv_pad_12[0][0]                
__________________________________________________________________________________________________
conv_dw_12_bn (BatchNormalizati (None, 7, 7, 512)    2048        conv_dw_12[0][0]                 
__________________________________________________________________________________________________
conv_dw_12_relu (ReLU)          (None, 7, 7, 512)    0           conv_dw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_12 (Conv2D)             (None, 7, 7, 1024)   524288      conv_dw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_12_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_12[0][0]                 
__________________________________________________________________________________________________
conv_pw_12_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_13 (DepthwiseConv2D)    (None, 7, 7, 1024)   9216        conv_pw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_dw_13[0][0]                 
__________________________________________________________________________________________________
conv_dw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_dw_13_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_13 (Conv2D)             (None, 7, 7, 1024)   1048576     conv_dw_13_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_13[0][0]                 
__________________________________________________________________________________________________
conv_pw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_13_bn[0][0]              
__________________________________________________________________________________________________
up_sampling2d_15 (UpSampling2D) (None, 14, 14, 1024) 0           conv_pw_13_relu[0][0]            
__________________________________________________________________________________________________
concatenate_9 (Concatenate)     (None, 14, 14, 1536) 0           up_sampling2d_15[0][0]           
                                                                 conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
Conv_6_1_conv (Conv2D)          (None, 14, 14, 256)  3539200     concatenate_9[0][0]              
__________________________________________________________________________________________________
Conv_6_1BatchNormalization (Bat (None, 14, 14, 256)  1024        Conv_6_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_6_1ActivationLayer (Activa (None, 14, 14, 256)  0           Conv_6_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_6_2_conv (Conv2D)          (None, 14, 14, 256)  590080      Conv_6_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_6_2BatchNormalization (Bat (None, 14, 14, 256)  1024        Conv_6_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_6_2ActivationLayer (Activa (None, 14, 14, 256)  0           Conv_6_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_16 (UpSampling2D) (None, 28, 28, 256)  0           Conv_6_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_10 (Concatenate)    (None, 28, 28, 512)  0           up_sampling2d_16[0][0]           
                                                                 conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
Conv_7_1_conv (Conv2D)          (None, 28, 28, 256)  1179904     concatenate_10[0][0]             
__________________________________________________________________________________________________
Conv_7_1BatchNormalization (Bat (None, 28, 28, 256)  1024        Conv_7_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_7_1ActivationLayer (Activa (None, 28, 28, 256)  0           Conv_7_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_7_2_conv (Conv2D)          (None, 28, 28, 256)  590080      Conv_7_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_7_2BatchNormalization (Bat (None, 28, 28, 256)  1024        Conv_7_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_7_2ActivationLayer (Activa (None, 28, 28, 256)  0           Conv_7_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_17 (UpSampling2D) (None, 56, 56, 256)  0           Conv_7_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_11 (Concatenate)    (None, 56, 56, 384)  0           up_sampling2d_17[0][0]           
                                                                 conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
Conv_8_1_conv (Conv2D)          (None, 56, 56, 192)  663744      concatenate_11[0][0]             
__________________________________________________________________________________________________
Conv_8_1BatchNormalization (Bat (None, 56, 56, 192)  768         Conv_8_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_8_1ActivationLayer (Activa (None, 56, 56, 192)  0           Conv_8_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_8_2_conv (Conv2D)          (None, 56, 56, 128)  221312      Conv_8_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_8_2BatchNormalization (Bat (None, 56, 56, 128)  512         Conv_8_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_8_2ActivationLayer (Activa (None, 56, 56, 128)  0           Conv_8_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_18 (UpSampling2D) (None, 112, 112, 128 0           Conv_8_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_12 (Concatenate)    (None, 112, 112, 192 0           up_sampling2d_18[0][0]           
                                                                 conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
Conv_9_1_conv (Conv2D)          (None, 112, 112, 96) 165984      concatenate_12[0][0]             
__________________________________________________________________________________________________
Conv_9_1BatchNormalization (Bat (None, 112, 112, 96) 384         Conv_9_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_9_1ActivationLayer (Activa (None, 112, 112, 96) 0           Conv_9_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_9_2_conv (Conv2D)          (None, 112, 112, 64) 55360       Conv_9_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_9_2BatchNormalization (Bat (None, 112, 112, 64) 256         Conv_9_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_9_2ActivationLayer (Activa (None, 112, 112, 64) 0           Conv_9_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_19 (UpSampling2D) (None, 224, 224, 64) 0           Conv_9_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_13 (Concatenate)    (None, 224, 224, 67) 0           up_sampling2d_19[0][0]           
                                                                 input_4[0][0]                    
__________________________________________________________________________________________________
Conv_10_1_conv (Conv2D)         (None, 224, 224, 48) 28992       concatenate_13[0][0]             
__________________________________________________________________________________________________
Conv_10_1BatchNormalization (Ba (None, 224, 224, 48) 192         Conv_10_1_conv[0][0]             
__________________________________________________________________________________________________
Conv_10_1ActivationLayer (Activ (None, 224, 224, 48) 0           Conv_10_1BatchNormalization[0][0]
__________________________________________________________________________________________________
Conv_10_2_conv (Conv2D)         (None, 224, 224, 32) 13856       Conv_10_1ActivationLayer[0][0]   
__________________________________________________________________________________________________
Conv_10_2BatchNormalization (Ba (None, 224, 224, 32) 128         Conv_10_2_conv[0][0]             
__________________________________________________________________________________________________
Conv_10_2ActivationLayer (Activ (None, 224, 224, 32) 0           Conv_10_2BatchNormalization[0][0]
__________________________________________________________________________________________________
spatial_dropout2d_1 (SpatialDro (None, 224, 224, 32) 0           Conv_10_2ActivationLayer[0][0]   
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 224, 224, 1)  33          spatial_dropout2d_1[0][0]        
__________________________________________________________________________________________________
reshape_3 (Reshape)             (None, 224, 224)     0           conv2d_3[0][0]                   
==================================================================================================
Total params: 10,283,745
Trainable params: 7,051,713
Non-trainable params: 3,232,032
__________________________________________________________________________________________________
In [ ]:
start = time.time()
In [ ]:
model.compile(loss = loss, optimizer = optimizer, metrics = [dice_coefficient])
model.fit(X_train, y_train, 
           epochs = 50, batch_size = 1, callbacks = [checkpoint, reduce_lr, stop], validation_data = (X_test, y_test))
Epoch 1/30
661/661 [==============================] - 23s 30ms/step - loss: 9.7191 - dice_coefficient: 0.1097 - val_loss: 8.8091 - val_dice_coefficient: 0.1363

Epoch 00001: val_loss did not improve from 8.42229
Epoch 2/30
661/661 [==============================] - 19s 29ms/step - loss: 9.7213 - dice_coefficient: 0.1370 - val_loss: 8.7897 - val_dice_coefficient: 0.1414

Epoch 00002: val_loss did not improve from 8.42229
Epoch 3/30
661/661 [==============================] - 19s 29ms/step - loss: 9.1321 - dice_coefficient: 0.1867 - val_loss: 8.7460 - val_dice_coefficient: 0.1219

Epoch 00003: val_loss did not improve from 8.42229
Epoch 4/30
661/661 [==============================] - 19s 28ms/step - loss: 9.4797 - dice_coefficient: 0.2061 - val_loss: 8.7477 - val_dice_coefficient: 0.1327

Epoch 00004: val_loss did not improve from 8.42229
Epoch 5/30
661/661 [==============================] - 19s 28ms/step - loss: 8.8992 - dice_coefficient: 0.2465 - val_loss: 8.7640 - val_dice_coefficient: 0.1116

Epoch 00005: val_loss did not improve from 8.42229
Epoch 6/30
661/661 [==============================] - 19s 29ms/step - loss: 8.9616 - dice_coefficient: 0.2551 - val_loss: 8.8086 - val_dice_coefficient: 0.0993

Epoch 00006: val_loss did not improve from 8.42229

Epoch 00006: ReduceLROnPlateau reducing learning rate to 3.999999898951501e-06.
Epoch 7/30
661/661 [==============================] - 19s 29ms/step - loss: 8.9883 - dice_coefficient: 0.2793 - val_loss: 8.7959 - val_dice_coefficient: 0.1040

Epoch 00007: val_loss did not improve from 8.42229
Epoch 8/30
661/661 [==============================] - 19s 29ms/step - loss: 9.2605 - dice_coefficient: 0.2626 - val_loss: 8.8118 - val_dice_coefficient: 0.0981

Epoch 00008: val_loss did not improve from 8.42229
Out[ ]:
<tensorflow.python.keras.callbacks.History at 0x7f38a229cdd8>
In [ ]:
print(f'Time: {time.time() - start}')
Time: 162.6215844154358
In [ ]:
scores = model.evaluate(X_test, y_test, verbose = 1)
6/6 [==============================] - 4s 265ms/step - loss: 2.0750 - dice_coefficient: 0.1505
In [ ]:
print("Accuracy: ", scores[1])
print("Loss: ", scores[0])
Accuracy:  0.15050871670246124
Loss:  2.0750176906585693
In [ ]:
y_pred = model.predict(X_test, verbose = 1)
6/6 [==============================] - 1s 194ms/step
In [ ]:
viewImage(3, X_test, y_pred)
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).

Model 2: Predict Pneumonia

In [102]:
path_class_reduced_target = df_train_class_merged.sample(frac = 1) 
In [103]:
# A dataframe with paths, classes and targets
print('Prepare a dataframe with paths, classes and targets'); print('--'*40)
#path_class_target = path_class_reduced_target[['path', 'Target']].copy(deep = True)
train_class_updt_df.drop_duplicates(inplace = True)
path_class_reduced_df = train_class_updt_df.reset_index()
display(path_class_reduced_df.shape, path_class_reduced_df.nunique())
print('\nDistribution of target and classes')
display(path_class_reduced_df['Target'].value_counts())
Prepare a dataframe with paths, classes and targets
--------------------------------------------------------------------------------
(41343, 8)
level_0    41343
index      41343
path       33744
x1           886
y1           876
x2           900
y2           849
Target         2
dtype: int64
Distribution of target and classes
0    20672
1    20671
Name: Target, dtype: int64
In [104]:
path_class_reduced_df['Target'].value_counts()
Out[104]:
0    20672
1    20671
Name: Target, dtype: int64
In [105]:
path_class_reduced_df.head()
Out[105]:
level_0 index path x1 y1 x2 y2 Target
0 0 17964 JPG_train/7dd70f51-5d1b-49a8-9947-aabd95bcfb99... 0 0 0 0 0
1 1 31601 JPG_train_aug/aug_1_17398977-1b0f-4de4-9038-9c... 510 114 1121 693 1
2 2 29705 JPG_train/0ea1d0f4-e828-487f-9f29-06e094018054... 0 0 0 0 0
3 3 20737 JPG_train/9c52f20d-a5ea-45b2-a289-e6b7d2b614e5... 0 0 0 0 0
4 4 34763 JPG_train_aug/aug_1_3fcc1a52-4329-4c61-9a79-f6... 319 544 532 751 1

Added the below code to distribute the data for train,test and val

In [106]:
image_list_df_0 = path_class_reduced_df[path_class_reduced_target['Target']==0]
image_list_df_1 = path_class_reduced_df[path_class_reduced_target['Target']==1]
In [107]:
image_list_df_0.shape,image_list_df_1.shape
Out[107]:
((20672, 8), (20671, 8))
In [108]:
df_train_s=image_list_df_0[:round(image_list_df_0.shape[0]*0.7)]
df_train_0=df_train_s[:round(df_train_s.shape[0]*0.7)]
df_val_0=df_train_s[round(df_train_s.shape[0]*0.7):]
df_test_0=image_list_df_0[round(image_list_df_0.shape[0]*0.7):]
In [109]:
df_train_s.shape,df_train_0.shape,df_val_0.shape,df_test_0.shape
Out[109]:
((14470, 8), (10129, 8), (4341, 8), (6202, 8))
In [110]:
df_train_s=image_list_df_1[:round(image_list_df_1.shape[0]*0.7)]
df_train_1=df_train_s[:round(df_train_s.shape[0]*0.7)]
df_val_1=df_train_s[round(df_train_s.shape[0]*0.7):]
df_test_1=image_list_df_1[round(image_list_df_1.shape[0]*0.7):]
In [111]:
df_train_s.shape,df_train_1.shape,df_val_1.shape,df_test_1.shape
Out[111]:
((14470, 8), (10129, 8), (4341, 8), (6201, 8))
In [112]:
df_train=df_train_0.append(df_train_1, ignore_index=True)
df_val=df_val_0.append(df_val_1, ignore_index=True)
df_test=df_test_0.append(df_test_1, ignore_index=True)

From here the code remains same

In [113]:
df_train.shape
Out[113]:
(20258, 8)
In [114]:
df_train.head()
Out[114]:
level_0 index path x1 y1 x2 y2 Target
0 9555 552 JPG_train/12b4c5bc-48cb-4010-9b2e-eeb8b0d5cc8e... 568 338 797 595 1
1 9556 6479 JPG_train/b6251813-a489-4cf8-b9ae-ca95fcfccb4f... 236 135 436 655 1
2 9557 1022 JPG_train/3170c2b4-aa5a-4e6f-bf27-39ee0a2a416c... 202 480 413 703 1
3 9558 35833 JPG_train_aug/aug_1_5735ead4-b24d-411d-8151-e3... 559 258 753 445 1
4 9559 30032 JPG_train/2846f61a-9763-4ddc-80a9-a8701c974aa6... 0 0 0 0 0
In [115]:
df_train.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 20258 entries, 0 to 20257
Data columns (total 8 columns):
 #   Column   Non-Null Count  Dtype 
---  ------   --------------  ----- 
 0   level_0  20258 non-null  int64 
 1   index    20258 non-null  int64 
 2   path     20258 non-null  object
 3   x1       20258 non-null  object
 4   y1       20258 non-null  object
 5   x2       20258 non-null  object
 6   y2       20258 non-null  object
 7   Target   20258 non-null  object
dtypes: int64(2), object(6)
memory usage: 1.2+ MB
In [116]:
print('Training, Validation and Test set is ~equally distributed on target'); print('--'*40)
print('Distribution of target in the training set:'); 
display(pd.Series(df_train['Target']).value_counts(normalize = True).round(2))
print('\nDistribution of target in the validation set:'); 
display(pd.Series(df_val['Target']).value_counts(normalize = True).round(2))
print('\nDistribution of target in the test set:'); 
display(pd.Series(df_test['Target']).value_counts(normalize = True).round(2))
Training, Validation and Test set is ~equally distributed on target
--------------------------------------------------------------------------------
Distribution of target in the training set:
1    0.5
0    0.5
Name: Target, dtype: float64
Distribution of target in the validation set:
1    0.5
0    0.5
Name: Target, dtype: float64
Distribution of target in the test set:
0    0.5
1    0.5
Name: Target, dtype: float64
In [117]:
print('Save the train, valid and test dataframes for future use');print('--'*40)
df_train.to_pickle('train_data.pkl')
df_val.to_pickle('valid_data.pkl')
df_test.to_pickle('test_data.pkl')
Save the train, valid and test dataframes for future use
--------------------------------------------------------------------------------
In [118]:
# Data generator
class DataGenerators:
    def __init__(self, df_train, df_val, df_test, batch_size, path,
                 img_size = (224, 224), class_mode = 'binary',
                 random_state = 2020):
        self.df_train = df_train
        self.df_val = df_val
        self.df_test = df_test
        self.batch_size = batch_size
        self.img_size = img_size
        self.path = path
        self.class_mode = class_mode
        
        train_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input,
            rotation_range = 20, width_shift_range = 0.2,
            height_shift_range = 0.2, zoom_range = 0.2,
            horizontal_flip = True, rescale = 1/255.
            )
        
        valid_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input, 
            rescale = 1/255.
            )
        # Added the test code
        test_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input,
            rescale = 1/255.
            )
        
        print('Train Generator Created', '--'*20)
        self.train_generator = train_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_train,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = True
            )
        print('Validation Generator Created', '--'*20)
        self.valid_generator = valid_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_val,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = False
            )
        print('Test Generator Created', '--'*20)
        self.test_generator = test_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_test,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = False
            )
        
        self.step_size_train = math.ceil(
            self.train_generator.n//self.train_generator.batch_size + 1
            )
        self.step_size_valid = math.ceil(
            self.valid_generator.n//self.valid_generator.batch_size + 1
            )
        self.step_size_test = math.ceil(
            self.test_generator.n//self.test_generator.batch_size + 1
            )
In [119]:
df_train['Target'] = df_train['Target'].astype(str); 
df_val['Target'] = df_val['Target'].astype(str); 
df_test['Target'] = df_test['Target'].astype(str)
In [120]:
df_train.head(3)
Out[120]:
level_0 index path x1 y1 x2 y2 Target
0 9555 552 JPG_train/12b4c5bc-48cb-4010-9b2e-eeb8b0d5cc8e... 568 338 797 595 1
1 9556 6479 JPG_train/b6251813-a489-4cf8-b9ae-ca95fcfccb4f... 236 135 436 655 1
2 9557 1022 JPG_train/3170c2b4-aa5a-4e6f-bf27-39ee0a2a416c... 202 480 413 703 1
In [121]:
#TRAIN_IMAGES_DIR = os.path.join('/Volumes/Ayon_Drive/GreatLearning/Capstone_Pneumonia/')
TRAIN_IMAGES_DIR = os.path.join('/content/gdrive/MyDrive/Colab_Notebooks/Capstone_Project')
In [122]:
TRAIN_IMAGES_DIR
Out[122]:
'/content/gdrive/MyDrive/Colab_Notebooks/Capstone_Project'
In [123]:
print('Create generators for training, validation and test dataframes'); print('--'*40)
generators = DataGenerators(df_train, df_val, df_test, 
                            batch_size = 32, 
                            path = TRAIN_IMAGES_DIR, 
                            img_size = (224, 224), 
                            class_mode = 'binary',
                            random_state = 2020)
Create generators for training, validation and test dataframes
--------------------------------------------------------------------------------
Train Generator Created ----------------------------------------
Found 20258 validated image filenames belonging to 2 classes.
Validation Generator Created ----------------------------------------
Found 8682 validated image filenames belonging to 2 classes.
Test Generator Created ----------------------------------------
Found 12403 validated image filenames belonging to 2 classes.
In [124]:
# ROC AUC as a Metric
# Reference: https://stackoverflow.com/questions/41032551/how-to-compute-receiving-operating-characteristic-roc-and-auc-in-keras
def roc_auc(y_true, y_pred):
    return tf.compat.v1.py_function(roc_auc_score, (y_true, y_pred), tf.double)

# Average Precision as a Metric
import tensorflow.keras.backend as K
def average_precision(y_true, y_pred):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision
# [Swathi] Added code for confusion matrix
def confusion_mtrx(y_test,y_pred,labels=[1,0]):
  cm=confusion_matrix(y_test,y_pred,labels)
  df_cm=pd.DataFrame(cm,index=[i for i in ["Actual 0","Actual 1"]],columns=[i for i in ["Predict 0","Predict 1"]])
  print("Confusion Matrix for Iteration1:\n\n{}".format(df_cm))

# F1 score as a Metric
# Reference: https://stackoverflow.com/questions/43547402/how-to-calculate-f1-macro-in-keras
def f1_score(y_true, y_pred):
    def recall(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision
    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))
In [125]:
df_model_results = pd.DataFrame(columns=
               ['Model_Name','Train_Accuracy','Validation_Accuracy','Test_Accuracy','Loss','Total_Time_Secs','f1_Score','epoch'])
In [126]:
# Model Parameters
BATCH_SIZE = 32
IMAGE_SIZE = 224
#EPOCH = 5
EPOCH = 10
LEARNING_RATE = 1e-4
MONITOR = 'val_loss'
MODE = 'min'
VERBOSE = 1
FACTOR = 0.1
PATIENCE = 3
COOLDOWN = 5
#FINAL_MODEL = 'best_densenet_final.h5'
LOG_FILE = 'logs.csv'
LOSS = 'binary_crossentropy'
METRICS = ['accuracy', average_precision, f1_score]
In [127]:
def build_densenet_model():
    print('Create a `DenseNet121` model'); print('--'*40)
    input_shape = (IMAGE_SIZE, IMAGE_SIZE, 3)
    inputs = Input(shape = input_shape)
    initializer = tf.keras.initializers.GlorotNormal()
    
    base_model = DenseNet121(include_top = False, input_tensor = inputs, weights ='imagenet')
    
    densenet = Model(inputs = inputs, outputs = base_model.layers[-1].output, name = 'DenseNet121')
    model = Sequential(name = 'DenseNet121')
    model.add(densenet)
    model.add(GlobalAveragePooling2D())
    model.add(Dropout(0.4))
    model.add(Dense(1, activation = 'sigmoid', kernel_initializer = initializer))
    
    model.summary()
    
    """
    for layer in base_model.layers[:-12]:
        layer.trainable = False
    model = base_model.output
    model = Flatten()(model)
    predictions = Dense(1, activation = 'relu')(model)
    model = Model(inputs=base_model.input, outputs=predictions)
    """
    
    
    
    
    return model

def build_vgg16_model():
    print('Create a VGG16 model'); print('--'*40)
    
    base_model = VGG16(include_top=False,
                  input_shape = (224,224,3),
                  weights = 'imagenet')

    #for layer in base_model.layers[:-12]:
    base_model.trainable = False
        
    model = base_model.output
    model = Flatten()(model)
    predictions = Dense(1, activation = 'relu')(model)
    model = Model(inputs=base_model.input, outputs=predictions)
    model.summary()
    
    return model


def build_vgg16_trainModel():
    print('Create a Trainable VGG16 model'); print('--'*40)
    
    input_shape = (IMAGE_SIZE, IMAGE_SIZE, 3)
    inputs = Input(shape = input_shape)
    
    base_model = VGG16(include_top=False,
                  input_shape = (224,224,3),
                  weights = 'imagenet')
    
    base_model.trainable = True
    for layer in base_model.layers[:15]:
        layer.trainable = False
        
    for idx, layer in enumerate(base_model.layers):
        print("layer {}: {}, trainable: {}".format(idx, layer.name, layer.trainable))
    
    """
    vgg16 = Model(inputs = inputs, outputs = base_model.layers[-1].output, name = 'VGG16')
    model = Sequential(name = 'VGG16')
    model.add(vgg16)
    model.add(BatchNormalization())
    model.add(Flatten())
    model.add(Dense(1, activation = 'sigmoid'))
    model.summary()
    """
    
    fine_tune_model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
    ])

    #FINE_TUNE_LEARNING_RATE = LEARNING_RATE / 10

    #optimizer   = tf.keras.optimizers.Adam(learning_rate=FINE_TUNE_LEARNING_RATE)
    #loss        = 'categorical_crossentropy'
    #metrics     = ['accuracy']

    #fine_tune_model.compile(
    #    loss=loss,     
    #    optimizer=optimizer, 
    #    metrics=metrics
    #)

    fine_tune_model.summary()
    
    return model


def build_restnet_model():
    print('Create a Trainable RestNet model'); print('--'*40)
    
    base_model = ResNet50(include_top=False,
                  input_shape = (224,224,3),
                  weights = 'imagenet')

    #for layer in base_model.layers[:-12]:
    base_model.trainable = False 

    model = base_model.output
    model = Flatten()(model)
    predictions = Dense(1, activation = 'relu')(model)
    model = Model(inputs=base_model.input, outputs=predictions)
    model.summary()
    
    return model

## Debashis code
def build_restnet_trainModel():
    print('Create a RestNet model'); print('--'*40)
    
    base_model = ResNet50(include_top=False,
                  input_shape = (224,224,3),
                  weights = 'imagenet')

    for layer in base_model.layers[:-10]:
        layer.trainable = False

    model = base_model.output
    model = AveragePooling2D(pool_size = (4,4)) (model)
    model = Flatten()(model)
    model = Dense(256, activation = 'relu')(model)
    model = Dropout(0.4)(model)
    model = Dense(128, activation = 'relu')(model)
    model = Dropout(0.4)(model)
    model = Dense(64, activation = 'relu')(model)
    model = Dropout(0.4)(model)
    predictions = Dense(1, activation = 'softmax')(model)
    model = Model(inputs=base_model.input, outputs=predictions)
    model.summary()
    
    return model


def build_inceptionV3_model():
    print('Create a Inceptionv3 model'); print('--'*40)
    
    base_model = InceptionV3(include_top=False,
                  input_shape = (224,224,3),
                  weights = 'imagenet')
    base_model.trainable = False 
    model = base_model.output
    model = Flatten()(model)
    predictions = Dense(1, activation = 'relu')(model)
    model = Model(inputs=base_model.input, outputs=predictions)
    model.summary()
    
    return model



def callback_model(FINAL_MODEL):
    print("in call backs")
    lrscheduler = ReduceLROnPlateau(monitor = MONITOR, factor = FACTOR, 
                                    patience = PATIENCE, verbose = VERBOSE, 
                                    mode = MODE, cooldown = COOLDOWN)
    
    model.compile(optimizer = Adam(lr = LEARNING_RATE), loss = LOSS, metrics = METRICS)
    
    cp = ModelCheckpoint(filepath = MODEL_WEIGHTS + FINAL_MODEL, monitor = MONITOR, 
                         verbose = VERBOSE, save_best_only = True, mode = MODE)
    
    if os.path.exists(MODEL_WEIGHTS + LOG_FILE): os.remove(MODEL_WEIGHTS + LOG_FILE)
    csv_logger = CSVLogger(MODEL_WEIGHTS + LOG_FILE, append = True)
    
    callbacks = [cp, csv_logger, lrscheduler]
    return callbacks

def evaluateValidationData(model):
    
    ##Evaluate on validation data
    print('Evaluate the model on validation data'); print('--'*40)

    ##Prediction on validation data
    print('Predict on the validation data'); print('--'*40)
    validation_generator.reset()
    valid_pred_roc = model.predict_generator(generator = validation_generator,
                                             steps = generators.step_size_valid,
                                             verbose = 1)
    valid_pred = []
    for i in valid_pred_roc:
        if i >= 0.5: valid_pred.append(1)
        else: valid_pred.append(0)
    y_valid = df_val['Target'].astype(int).values
    x_valid = df_val['path']
    
    return valid_pred, y_valid, x_valid, valid_pred_roc

def evaluateTestData(y_valid):
    
    ##Prediction on test data
    print('Predict on the test data'); print('--'*40)
    test_generator.reset()
    test_pred_roc = model.predict_generator(generator = test_generator,
                                            steps = generators.step_size_test,
                                            verbose = 1)
    test_pred = []
    for i in test_pred_roc:
        if i >= 0.5: test_pred.append(1)
        else: test_pred.append(0)
    y_test = df_test['Target'].astype(int).values
    x_test = df_test['path']
          
    display(pd.Series(y_valid).value_counts(), pd.Series(y_test).value_counts())
    
    correct = np.nonzero(test_pred == y_test)[0]
    incorrect = np.nonzero(test_pred != y_test)[0]
    percentage = ((correct.size)/(correct.size + incorrect.size)) * 100
    
    print("Correctly predicted %d images out of %d images" %(correct.size, correct.size+incorrect.size))
    print("Predicted %.0f%% test images correctly" %(correct.size/(correct.size+incorrect.size)*100))
    
    return test_pred, y_test, x_test, correct, incorrect, test_pred_roc    

def evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test):
    
    print('ROC Curve for the validation data'); print('--'*40)

    roc_auc_valid = roc_auc_score(y_valid, np.array(valid_pred_roc).reshape(-1))
    print('AUC: {:0.3f}'.format(roc_auc_valid))

    fig = plt.figure(figsize = (10, 7.2))
    fpr, tpr, thresholds = roc_curve(y_valid, np.array(valid_pred_roc).reshape(-1))
    plt.title('ROC Curve for the validation data')
    plt.ylabel('True Positive Rate')
    plt.xlabel('False Positive Rate')
    plt.axis([0, 1, 0, 1])
    plt.plot([0, 1], [0, 1], linestyle = '--', label = 'No Skill')
    plt.plot(fpr, tpr, marker = '.', label = 'ROC curve (area = %0.3f)' % roc_auc_valid)
    plt.legend(loc = 'lower right')
    plt.show()
          
    print('ROC Curve for the test data'); print('--'*40)

    roc_auc_test = roc_auc_score(y_test, np.array(test_pred_roc).reshape(-1))
    print('AUC: {:0.3f}'.format(roc_auc_test))

    fig = plt.figure(figsize = (10, 7.2))
    fpr, tpr, thresholds = roc_curve(y_test, np.array(test_pred_roc).reshape(-1))
    plt.title('ROC Curve for the test data')
    plt.ylabel('True Positive Rate')
    plt.xlabel('False Positive Rate')
    plt.axis([0, 1, 0, 1])
    plt.plot([0, 1], [0, 1], linestyle = '--', label = 'Random')
    plt.plot(fpr, tpr, marker = '.', label = 'ROC curve (area = %0.3f)' % roc_auc_test)
    plt.legend(loc = 'lower right')
    plt.show()
          
    print('Classification Report on the test data'); print('--'*60)
    print(classification_report(y_test, test_pred, target_names = ['Normal', 'Pneumonia']))
          
    print('Classification Report on the validation data'); print('--'*60)
    print(classification_report(y_valid, valid_pred, target_names = ['Normal', 'Pneumonia']))
    
def populateModelResults(df_model_results, modelName, accuracy_Train, accuracy_Valid, 
                                                 accuracy_Test, loss, totalTime, testF1,epoch):
    df_model_results = df_model_results.append({'Model_Name':modelName,'Train_Accuracy':accuracy_Train,
                        'Validation_Accuracy':accuracy_Valid, 'Test_Accuracy':accuracy_Test, 'Loss':loss, 
                                                'Total_Time_Secs':totalTime, 'f1_Score':testF1,'epoch':epoch}, ignore_index=True)
    return df_model_results
In [128]:
MODEL_WEIGHTS = os.path.join('model_weights/')
if not os.path.exists(MODEL_WEIGHTS): os.makedirs(MODEL_WEIGHTS)
In [129]:
import time
start = time.time()
In [130]:
print('Lets fit the model.....')
FINAL_MODEL = "densenet_final.h5"
K.clear_session()
model = build_densenet_model()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the model.....
Create a `DenseNet121` model
--------------------------------------------------------------------------------
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "DenseNet121"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
DenseNet121 (Functional)     (None, 7, 7, 1024)        7037504   
_________________________________________________________________
global_average_pooling2d (Gl (None, 1024)              0         
_________________________________________________________________
dropout (Dropout)            (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 1)                 1025      
=================================================================
Total params: 7,038,529
Trainable params: 6,954,881
Non-trainable params: 83,648
_________________________________________________________________
in call backs
Epoch 1/10
634/634 [==============================] - 3911s 6s/step - loss: 0.4246 - accuracy: 0.8107 - average_precision: 0.8294 - f1_score: 0.7953 - val_loss: 0.6372 - val_accuracy: 0.5779 - val_average_precision: 0.9299 - val_f1_score: 0.2650

Epoch 00001: val_loss improved from inf to 0.63723, saving model to model_weights/densenet_final.h5
Epoch 2/10
634/634 [==============================] - 576s 909ms/step - loss: 0.3120 - accuracy: 0.8615 - average_precision: 0.8668 - f1_score: 0.8552 - val_loss: 0.2995 - val_accuracy: 0.8639 - val_average_precision: 0.8803 - val_f1_score: 0.8560

Epoch 00002: val_loss improved from 0.63723 to 0.29951, saving model to model_weights/densenet_final.h5
Epoch 3/10
634/634 [==============================] - 573s 903ms/step - loss: 0.2929 - accuracy: 0.8676 - average_precision: 0.8793 - f1_score: 0.8632 - val_loss: 0.3862 - val_accuracy: 0.8345 - val_average_precision: 0.9493 - val_f1_score: 0.8046

Epoch 00003: val_loss did not improve from 0.29951
Epoch 4/10
634/634 [==============================] - 565s 891ms/step - loss: 0.2810 - accuracy: 0.8731 - average_precision: 0.8841 - f1_score: 0.8687 - val_loss: 0.3156 - val_accuracy: 0.8682 - val_average_precision: 0.8918 - val_f1_score: 0.8597

Epoch 00004: val_loss did not improve from 0.29951
Epoch 5/10
634/634 [==============================] - 566s 893ms/step - loss: 0.2792 - accuracy: 0.8789 - average_precision: 0.8939 - f1_score: 0.8751 - val_loss: 0.2783 - val_accuracy: 0.8736 - val_average_precision: 0.8975 - val_f1_score: 0.8654

Epoch 00005: val_loss improved from 0.29951 to 0.27828, saving model to model_weights/densenet_final.h5
Epoch 6/10
634/634 [==============================] - 569s 897ms/step - loss: 0.2722 - accuracy: 0.8774 - average_precision: 0.8897 - f1_score: 0.8736 - val_loss: 0.3199 - val_accuracy: 0.8637 - val_average_precision: 0.9367 - val_f1_score: 0.8461

Epoch 00006: val_loss did not improve from 0.27828
Epoch 7/10
634/634 [==============================] - 562s 887ms/step - loss: 0.2672 - accuracy: 0.8806 - average_precision: 0.8899 - f1_score: 0.8746 - val_loss: 0.2926 - val_accuracy: 0.8635 - val_average_precision: 0.8358 - val_f1_score: 0.8643

Epoch 00007: val_loss did not improve from 0.27828
Epoch 8/10
634/634 [==============================] - 563s 888ms/step - loss: 0.2527 - accuracy: 0.8872 - average_precision: 0.8940 - f1_score: 0.8812 - val_loss: 0.3089 - val_accuracy: 0.8636 - val_average_precision: 0.9016 - val_f1_score: 0.8525

Epoch 00008: val_loss did not improve from 0.27828

Epoch 00008: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
Epoch 9/10
634/634 [==============================] - 559s 882ms/step - loss: 0.2426 - accuracy: 0.8956 - average_precision: 0.9082 - f1_score: 0.8903 - val_loss: 0.2937 - val_accuracy: 0.8740 - val_average_precision: 0.9199 - val_f1_score: 0.8629

Epoch 00009: val_loss did not improve from 0.27828
Epoch 10/10
634/634 [==============================] - 569s 898ms/step - loss: 0.2290 - accuracy: 0.8971 - average_precision: 0.9073 - f1_score: 0.8944 - val_loss: 0.2991 - val_accuracy: 0.8723 - val_average_precision: 0.9184 - val_f1_score: 0.8611

Epoch 00010: val_loss did not improve from 0.27828
Save the final weights
--------------------------------------------------------------------------------
In [131]:
totalDensetTime = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalDensetTime)
Time: 9134.04 secs
In [132]:
oss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {round(loss_Train, 3)}, Accuracy: {round(float(accuracy_Train), 3)}, AP: {round(float(ap_Train), 3)}, F1 Score: {round(float(f1_Train), 3)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 0.215, Accuracy: 0.903, AP: 0.906, F1 Score: 0.899
In [133]:
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
print(f'Loss: {round(loss_Valid, 3)}, Accuracy: {round(float(accuracy_Valid), 3)}, AP: {round(float(ap_Valid), 3)}, F1 Score: {round(float(f1_Valid), 3)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.299, Accuracy: 0.872, AP: 0.918, F1 Score: 0.861
In [134]:
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {round(loss_Test, 3)},Accuracy: {round(float(accuracy_Test), 3)} ,AP: {round(float(ap_Test), 3)},F1 Score: {round(float(f1_Test), 3)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 0.302,Accuracy: 0.871 ,AP: 0.904,F1 Score: 0.86
In [135]:
print(accuracy_Train, accuracy_Valid, accuracy_Test, loss_Test, totalDensetTime, f1_Test)
epochs=5
df_model_results = populateModelResults(df_model_results, "DenseNet121", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalDensetTime, f1_Test,10)
df_model_results.head(10)
0.9030506610870361 0.8722644448280334 0.870918333530426 0.3021591603755951 9134.04 0.8597314357757568
Out[135]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score epoch
0 DenseNet121 0.903051 0.872264 0.870918 0.302159 9134.04 0.859731 5
In [136]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Predict on the validation data
--------------------------------------------------------------------------------
272/272 [==============================] - 103s 372ms/step
In [137]:
test_pred, y_test, x_test, correct, incorrect,test_pred_roc = evaluateTestData(model)
Predict on the test data
--------------------------------------------------------------------------------
388/388 [==============================] - 143s 368ms/step
<tensorflow.python.keras.engine.sequential.Sequential object at 0x7eff8013b518>    1
dtype: int64
0    6257
1    6146
dtype: int64
Correctly predicted 10802 images out of 12403 images
Predicted 87% test images correctly

[Swathi]: called the function for confusion matrix

In [140]:
confusion_mtrx(test_pred,y_test,labels=[1,0])                                                     
Confusion Matrix for Iteration1:

          Predict 0  Predict 1
Actual 0       5085        540
Actual 1       1061       5717
In [141]:
def viewPredictedImage(correct):
    for c in correct[:4]:
        #print(c)
        f, ((ax1)) = plt.subplots(1, 1, figsize = (15, 8))
        #print(c)
        img = load_image(x_test[c])
        #print(c)
        img = cv2.resize(img, dsize=(IMAGE_HEIGHT, IMAGE_WIDTH), interpolation=cv2.INTER_CUBIC)
        ax1.imshow(img, cmap = plt.cm.bone)
        ax1.set_title("Predicted Class {},Actual Class {}".format(valid_pred[c], y_test[c]))
        ax1.axis('off')
        plt.show()
In [142]:
viewPredictedImage(correct)
In [143]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.952
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.950

Model VGG 16 Updates

In [ ]:
start = time.time()
In [ ]:
print('Lets fit the Updated VGG16 model.....')
K.clear_session()
FINAL_MODEL = "Updated_VGG16_final.h5"
model = build_vgg16_trainModel()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the Updated VGG16 model.....
Create a Trainable VGG16 model
--------------------------------------------------------------------------------
layer 0: input_2, trainable: False
layer 1: block1_conv1, trainable: False
layer 2: block1_conv2, trainable: False
layer 3: block1_pool, trainable: False
layer 4: block2_conv1, trainable: False
layer 5: block2_conv2, trainable: False
layer 6: block2_pool, trainable: False
layer 7: block3_conv1, trainable: False
layer 8: block3_conv2, trainable: False
layer 9: block3_conv3, trainable: False
layer 10: block3_pool, trainable: False
layer 11: block4_conv1, trainable: False
layer 12: block4_conv2, trainable: False
layer 13: block4_conv3, trainable: False
layer 14: block4_pool, trainable: False
layer 15: block5_conv1, trainable: True
layer 16: block5_conv2, trainable: True
layer 17: block5_conv3, trainable: True
layer 18: block5_pool, trainable: True
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Model)                (None, 7, 7, 512)         14714688  
_________________________________________________________________
batch_normalization (BatchNo (None, 7, 7, 512)         2048      
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
dense (Dense)                (None, 2)                 50178     
=================================================================
Total params: 14,766,914
Trainable params: 7,130,626
Non-trainable params: 7,636,288
_________________________________________________________________
in call backs
Epoch 1/5
12/13 [==========================>...] - ETA: 31s - loss: 0.3207 - accuracy: 0.8619 - average_precision: 0.8570 - f1_score: 0.8428 
Epoch 00001: val_loss improved from inf to 1.19791, saving model to model_weights/Updated_VGG16_final.h5
13/13 [==============================] - 432s 33s/step - loss: 0.3377 - accuracy: 0.8579 - average_precision: 0.8680 - f1_score: 0.8421 - val_loss: 1.1979 - val_accuracy: 0.4898 - val_average_precision: 0.4715 - val_f1_score: 0.6386
Epoch 2/5
12/13 [==========================>...] - ETA: 31s - loss: 0.3157 - accuracy: 0.8619 - average_precision: 0.8585 - f1_score: 0.8635 
Epoch 00002: val_loss improved from 1.19791 to 0.90305, saving model to model_weights/Updated_VGG16_final.h5
13/13 [==============================] - 430s 33s/step - loss: 0.3148 - accuracy: 0.8604 - average_precision: 0.8694 - f1_score: 0.8630 - val_loss: 0.9031 - val_accuracy: 0.4898 - val_average_precision: 0.4715 - val_f1_score: 0.6386
Epoch 3/5
12/13 [==========================>...] - ETA: 30s - loss: 0.2466 - accuracy: 0.8757 - average_precision: 0.8615 - f1_score: 0.8642 
Epoch 00003: val_loss improved from 0.90305 to 0.67884, saving model to model_weights/Updated_VGG16_final.h5
13/13 [==============================] - 418s 32s/step - loss: 0.2561 - accuracy: 0.8731 - average_precision: 0.8681 - f1_score: 0.8653 - val_loss: 0.6788 - val_accuracy: 0.6327 - val_average_precision: 0.9167 - val_f1_score: 0.4396
Epoch 4/5
12/13 [==========================>...] - ETA: 30s - loss: 0.1922 - accuracy: 0.9199 - average_precision: 0.9250 - f1_score: 0.9226
Epoch 00004: val_loss did not improve from 0.67884
13/13 [==============================] - 411s 32s/step - loss: 0.1919 - accuracy: 0.9213 - average_precision: 0.9205 - f1_score: 0.9230 - val_loss: 0.7770 - val_accuracy: 0.4898 - val_average_precision: 0.4715 - val_f1_score: 0.6386
Epoch 5/5
12/13 [==========================>...] - ETA: 30s - loss: 0.2158 - accuracy: 0.9227 - average_precision: 0.9399 - f1_score: 0.9256 
Epoch 00005: val_loss did not improve from 0.67884
13/13 [==============================] - 422s 32s/step - loss: 0.2178 - accuracy: 0.9239 - average_precision: 0.9445 - f1_score: 0.9254 - val_loss: 1.3299 - val_accuracy: 0.4898 - val_average_precision: 0.4715 - val_f1_score: 0.6386
Save the final weights
--------------------------------------------------------------------------------
In [ ]:
totalUpdtVGG16Time = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalUpdtVGG16Time)
Time: 2120.38 secs
In [ ]:
loss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {round(loss_Train, 3)}, Accuracy: {round(float(accuracy_Train), 3)}, AP: {round(float(ap_Train), 3)}, F1 Score: {round(float(f1_Train), 3)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 1.278, Accuracy: 0.49, AP: 0.501, F1 Score: 0.665
In [ ]:
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
#print(f'Loss: {round(loss, 3)}, Accuracy: {round(float(accuracy), 3)}, AP: {round(float(ap), 3)}, F1 Score: {round(float(f1), 3)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
In [ ]:
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {round(loss_Test, 3)},Accuracy: {round(float(accuracy_Test), 3)} ,AP: {round(float(ap_Test), 3)},F1 Score: {round(float(f1_Test), 3)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 1.382,Accuracy: 0.449 ,AP: 0.44,F1 Score: 0.611
In [ ]:
df_model_results = populateModelResults(df_model_results, "Updated VGG16", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalRestnetTime, f1_Test)
df_model_results.head(10)
Out[ ]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score
0 DenseNet121 0.489848 0.489796 0.44898 0.927249 2651.06 0.610816
1 Updated VGG16 0.489848 0.489796 0.44898 1.381736 46.19 0.610816
In [ ]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 1.33, Accuracy: 0.49,  AP: 0.472, F1 Score: 0.639
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 17s 8s/step
In [ ]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 19s 9s/step
0    25
1    24
dtype: int64
0    27
1    22
dtype: int64
Correctly predicted 22 images out of 49 images
Predicted 45% test images correctly
In [ ]:
viewPredictedImage(correct)
In [ ]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.667
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.719
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        27
   Pneumonia       0.45      1.00      0.62        22

    accuracy                           0.45        49
   macro avg       0.22      0.50      0.31        49
weighted avg       0.20      0.45      0.28        49

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        25
   Pneumonia       0.49      1.00      0.66        24

    accuracy                           0.49        49
   macro avg       0.24      0.50      0.33        49
weighted avg       0.24      0.49      0.32        49

Model 2 VGG 16

In [ ]:
start = time.time()
In [ ]:
print('Lets fit the VGG16 model.....')
K.clear_session()
FINAL_MODEL = "VGG16_final.h5"
model = build_vgg16_model()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the VGG16 model.....
Create a VGG16 model
--------------------------------------------------------------------------------
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         [(None, 224, 224, 3)]     0         
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792      
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928     
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0         
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856     
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584    
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0         
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168    
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0         
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160   
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0         
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
dense (Dense)                (None, 1)                 25089     
=================================================================
Total params: 14,739,777
Trainable params: 25,089
Non-trainable params: 14,714,688
_________________________________________________________________
in call backs
Epoch 1/5
12/13 [==========================>...] - ETA: 39s - loss: 7.3751 - accuracy: 0.5110 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00001: val_loss improved from inf to 7.27298, saving model to model_weights/VGG16_final.h5
13/13 [==============================] - 531s 41s/step - loss: 7.4121 - accuracy: 0.5102 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 2/5
12/13 [==========================>...] - ETA: 36s - loss: 7.1260 - accuracy: 0.5221 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00002: val_loss did not improve from 7.27298
13/13 [==============================] - 499s 38s/step - loss: 7.3363 - accuracy: 0.5102 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/5
12/13 [==========================>...] - ETA: 35s - loss: 7.5116 - accuracy: 0.5138 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00003: val_loss did not improve from 7.27298
13/13 [==============================] - 476s 37s/step - loss: 7.5636 - accuracy: 0.5102 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 4/5
12/13 [==========================>...] - ETA: 32s - loss: 7.6803 - accuracy: 0.5083 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00004: val_loss did not improve from 7.27298

Epoch 00004: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
13/13 [==============================] - 443s 34s/step - loss: 7.6393 - accuracy: 0.5102 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/5
12/13 [==========================>...] - ETA: 32s - loss: 7.4152 - accuracy: 0.5083 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00005: val_loss did not improve from 7.27298
13/13 [==============================] - 447s 34s/step - loss: 7.4121 - accuracy: 0.5102 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Save the final weights
--------------------------------------------------------------------------------
In [ ]:
totalVGG16Time = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalVGG16Time)
Time: 2399.36 secs
In [ ]:
loss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {float(loss_Train)}, Accuracy: {float(accuracy_Train)}, AP: {float(ap_Train)}, F1 Score: {float(f1_Train)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 7.482583082639254, Accuracy: 0.510152280330658, AP: 0.0, F1 Score: 0.0
In [ ]:
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
print(f'Loss: {float(loss_Valid)}, Accuracy: {float(accuracy_Valid)}, AP: {float(ap_Valid)}, F1 Score: {float(f1_Valid)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 7.482583082639254, Accuracy: 0.510152280330658, AP: 0.0, F1 Score: 0.0
In [ ]:
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {float(loss_Test)}, Accuracy: {float(accuracy_Test)}, AP: {float(ap_Test)}, F1 Score: {float(f1_Test)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 6.790946960449219, Accuracy: 0.5510203838348389, AP: 0.0, F1 Score: 0.0
In [ ]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 17s 9s/step
0    25
1    24
dtype: int64
0    27
1    22
dtype: int64
Correctly predicted 27 images out of 49 images
Predicted 55% test images correctly
In [ ]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 7.273, Accuracy: 0.51,  AP: 0.0, F1 Score: 0.0
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 25s 12s/step
In [ ]:
print(accuracy_Train, accuracy_Valid, accuracy_Test, loss_Test, totalVGG16Time, f1_Test)
df_model_results = populateModelResults(df_model_results, "VGG16", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalVGG16Time, f1_Test)
0.5101523 0.5102041 0.5510204 6.790946960449219 2399.36 0.0
In [ ]:
df_model_results.head(10)
Out[ ]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score
0 DenseNet121 0.489848 0.489796 0.44898 0.927249 2651.06 0.610816
1 Updated VGG16 0.489848 0.489796 0.44898 1.381736 46.19 0.610816
2 VGG16 0.510152 0.510204 0.55102 6.790947 2399.36 0.000000
In [ ]:
viewPredictedImage(correct)
In [ ]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.500
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.500
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.55      1.00      0.71        27
   Pneumonia       0.00      0.00      0.00        22

    accuracy                           0.55        49
   macro avg       0.28      0.50      0.36        49
weighted avg       0.30      0.55      0.39        49

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.51      1.00      0.68        25
   Pneumonia       0.00      0.00      0.00        24

    accuracy                           0.51        49
   macro avg       0.26      0.50      0.34        49
weighted avg       0.26      0.51      0.34        49

Model 3 Restnet Model

In [ ]:
start = time.time()
In [ ]:
FINAL_MODEL = "Restnet_final.h5"
print('Lets fit the Restnet model.....')
K.clear_session()
#model = buildModel(ResNet50)
model = build_restnet_model()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the Restnet model.....
Create a Trainable RestNet model
--------------------------------------------------------------------------------
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, 230, 230, 3)  0           input_1[0][0]                    
__________________________________________________________________________________________________
conv1_conv (Conv2D)             (None, 112, 112, 64) 9472        conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 64) 256         conv1_conv[0][0]                 
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, 112, 112, 64) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D)       (None, 114, 114, 64) 0           conv1_relu[0][0]                 
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D)       (None, 56, 56, 64)   0           pool1_pad[0][0]                  
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, 56, 56, 64)   4160        pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 64)   0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 56, 56, 64)   0           conv2_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D)    (None, 56, 56, 256)  16640       pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_add (Add)          (None, 56, 56, 256)  0           conv2_block1_0_bn[0][0]          
                                                                 conv2_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_out (Activation)   (None, 56, 56, 256)  0           conv2_block1_add[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block1_out[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 64)   0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 56, 56, 64)   0           conv2_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_add (Add)          (None, 56, 56, 256)  0           conv2_block1_out[0][0]           
                                                                 conv2_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_out (Activation)   (None, 56, 56, 256)  0           conv2_block2_add[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block2_out[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 64)   0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 56, 56, 64)   0           conv2_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_add (Add)          (None, 56, 56, 256)  0           conv2_block2_out[0][0]           
                                                                 conv2_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_out (Activation)   (None, 56, 56, 256)  0           conv2_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, 28, 28, 128)  32896       conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128)  0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 28, 28, 128)  0           conv3_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D)    (None, 28, 28, 512)  131584      conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_add (Add)          (None, 28, 28, 512)  0           conv3_block1_0_bn[0][0]          
                                                                 conv3_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_out (Activation)   (None, 28, 28, 512)  0           conv3_block1_add[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block1_out[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128)  0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 28, 28, 128)  0           conv3_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_add (Add)          (None, 28, 28, 512)  0           conv3_block1_out[0][0]           
                                                                 conv3_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_out (Activation)   (None, 28, 28, 512)  0           conv3_block2_add[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block2_out[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128)  0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 28, 28, 128)  0           conv3_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_add (Add)          (None, 28, 28, 512)  0           conv3_block2_out[0][0]           
                                                                 conv3_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_out (Activation)   (None, 28, 28, 512)  0           conv3_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128)  0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 28, 28, 128)  0           conv3_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_add (Add)          (None, 28, 28, 512)  0           conv3_block3_out[0][0]           
                                                                 conv3_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_out (Activation)   (None, 28, 28, 512)  0           conv3_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, 14, 14, 256)  131328      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 256)  0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 14, 14, 256)  0           conv4_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D)    (None, 14, 14, 1024) 525312      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_0_bn[0][0]          
                                                                 conv4_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_out (Activation)   (None, 14, 14, 1024) 0           conv4_block1_add[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block1_out[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 256)  0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 14, 14, 256)  0           conv4_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_out[0][0]           
                                                                 conv4_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_out (Activation)   (None, 14, 14, 1024) 0           conv4_block2_add[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block2_out[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 256)  0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 14, 14, 256)  0           conv4_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_add (Add)          (None, 14, 14, 1024) 0           conv4_block2_out[0][0]           
                                                                 conv4_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_out (Activation)   (None, 14, 14, 1024) 0           conv4_block3_add[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block3_out[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 256)  0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 14, 14, 256)  0           conv4_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_add (Add)          (None, 14, 14, 1024) 0           conv4_block3_out[0][0]           
                                                                 conv4_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_out (Activation)   (None, 14, 14, 1024) 0           conv4_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 256)  0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 14, 14, 256)  0           conv4_block5_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block5_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block5_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_add (Add)          (None, 14, 14, 1024) 0           conv4_block4_out[0][0]           
                                                                 conv4_block5_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_out (Activation)   (None, 14, 14, 1024) 0           conv4_block5_add[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block5_out[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 256)  0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 14, 14, 256)  0           conv4_block6_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block6_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block6_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_add (Add)          (None, 14, 14, 1024) 0           conv4_block5_out[0][0]           
                                                                 conv4_block6_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_out (Activation)   (None, 14, 14, 1024) 0           conv4_block6_add[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D)    (None, 7, 7, 512)    524800      conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 512)    0           conv5_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 7, 7, 512)    0           conv5_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D)    (None, 7, 7, 2048)   2099200     conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_0_bn[0][0]          
                                                                 conv5_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_out (Activation)   (None, 7, 7, 2048)   0           conv5_block1_add[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block1_out[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 512)    0           conv5_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 7, 7, 512)    0           conv5_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_out[0][0]           
                                                                 conv5_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_out (Activation)   (None, 7, 7, 2048)   0           conv5_block2_add[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block2_out[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 512)    0           conv5_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 7, 7, 512)    0           conv5_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_add (Add)          (None, 7, 7, 2048)   0           conv5_block2_out[0][0]           
                                                                 conv5_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_out (Activation)   (None, 7, 7, 2048)   0           conv5_block3_add[0][0]           
__________________________________________________________________________________________________
flatten (Flatten)               (None, 100352)       0           conv5_block3_out[0][0]           
__________________________________________________________________________________________________
dense (Dense)                   (None, 1)            100353      flatten[0][0]                    
==================================================================================================
Total params: 23,688,065
Trainable params: 100,353
Non-trainable params: 23,587,712
__________________________________________________________________________________________________
in call backs
Epoch 1/5
12/13 [==========================>...] - ETA: 19s - loss: 5.6556 - accuracy: 0.5994 - average_precision: 0.6564 - f1_score: 0.4370
Epoch 00001: val_loss improved from inf to 7.27298, saving model to model_weights/Restnet_final.h5
13/13 [==============================] - 270s 21s/step - loss: 5.6673 - accuracy: 0.5939 - average_precision: 0.6675 - f1_score: 0.4302 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 2/5
12/13 [==========================>...] - ETA: 17s - loss: 4.4269 - accuracy: 0.6630 - average_precision: 0.7233 - f1_score: 0.5410
Epoch 00002: val_loss did not improve from 7.27298
13/13 [==============================] - 246s 19s/step - loss: 4.3058 - accuracy: 0.6751 - average_precision: 0.7328 - f1_score: 0.5598 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/5
12/13 [==========================>...] - ETA: 18s - loss: 3.2809 - accuracy: 0.7735 - average_precision: 0.7711 - f1_score: 0.7484
Epoch 00003: val_loss did not improve from 7.27298
13/13 [==============================] - 260s 20s/step - loss: 3.2139 - accuracy: 0.7766 - average_precision: 0.7743 - f1_score: 0.7533 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 4/5
12/13 [==========================>...] - ETA: 26s - loss: 2.9219 - accuracy: 0.7956 - average_precision: 0.8470 - f1_score: 0.7708
Epoch 00004: val_loss did not improve from 7.27298

Epoch 00004: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
13/13 [==============================] - 366s 28s/step - loss: 2.9863 - accuracy: 0.7919 - average_precision: 0.8347 - f1_score: 0.7680 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/5
12/13 [==========================>...] - ETA: 22s - loss: 2.4685 - accuracy: 0.8122 - average_precision: 0.8025 - f1_score: 0.8017
Epoch 00005: val_loss did not improve from 7.27298
13/13 [==============================] - 302s 23s/step - loss: 2.4605 - accuracy: 0.8096 - average_precision: 0.7936 - f1_score: 0.7984 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Save the final weights
--------------------------------------------------------------------------------
In [ ]:
totalRestnetTime = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalRestnetTime)
Time: 1453.18 secs
In [ ]:
loss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {loss_Train}, Accuracy: {float(accuracy_Train)}, AP: {float(ap_Train)}, F1 Score: {float(f1_Train)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 7.645731595846323, Accuracy: 0.510152280330658, AP: 0.0, F1 Score: 0.0
In [ ]:
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
print(f'Loss: {float(loss_Valid)}, Accuracy: {float(accuracy_Valid)}, AP: {float(ap_Valid)}, F1 Score: {float(f1_Valid)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 7.272976636886597, Accuracy: 0.5102040767669678, AP: 0.0, F1 Score: 0.0
In [ ]:
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {float(loss_Test)}, Accuracy: {float(accuracy_Test)}, AP: {float(ap_Test)}, F1 Score: {float(f1_Test)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 6.790946960449219, Accuracy: 0.5510203838348389, AP: 0.0, F1 Score: 0.0
In [ ]:
print(accuracy_Train, accuracy_Valid, accuracy_Test, loss_Test, totalRestnetTime, f1_Test)
df_model_results = populateModelResults(df_model_results, "Resnet50", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalRestnetTime, f1_Test)
df_model_results.head(10)
0.5101523 0.5102041 0.5510204 6.790946960449219 1453.18 0.0
Out[ ]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score
0 DenseNet121 0.489848 0.489796 0.44898 0.927249 2651.06 0.610816
1 Updated VGG16 0.489848 0.489796 0.44898 1.381736 46.19 0.610816
2 VGG16 0.510152 0.510204 0.55102 6.790947 2399.36 0.000000
3 Resnet50 0.510152 0.510204 0.55102 6.790947 1453.18 0.000000
In [ ]:
print('Predict on the validation data'); print('--'*40)
validation_generator.reset()
valid_pred_roc = model.predict_generator(generator = validation_generator,
                                         steps = generators.step_size_valid,
                                         verbose = 1)
valid_pred = []
for i in valid_pred_roc:
    if i >= 0.5: valid_pred.append(1)
    else: valid_pred.append(0)
y_valid = df_val['Target'].astype(int).values
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 13s 6s/step
In [ ]:
print('Predict on the test data'); print('--'*40)
test_generator.reset()
test_pred_roc = model.predict_generator(generator = test_generator,
                                        steps = generators.step_size_test,
                                        verbose = 1)
test_pred = []
for i in test_pred_roc:
    if i >= 0.5: test_pred.append(1)
    else: test_pred.append(0)
y_test = df_test['Target'].astype(int).values
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 12s 6s/step
In [ ]:
viewPredictedImage(correct)
In [ ]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.500
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.500
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.55      1.00      0.71        27
   Pneumonia       0.00      0.00      0.00        22

    accuracy                           0.55        49
   macro avg       0.28      0.50      0.36        49
weighted avg       0.30      0.55      0.39        49

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.51      1.00      0.68        25
   Pneumonia       0.00      0.00      0.00        24

    accuracy                           0.51        49
   macro avg       0.26      0.50      0.34        49
weighted avg       0.26      0.51      0.34        49

Updated Layer Restnet Model

In [ ]:
start = time.time()
In [ ]:
FINAL_MODEL = "Restnet_updated_final.h5"
print('Lets fit the updated Restnet model.....')
K.clear_session()
#model = buildModel(ResNet50)
model = build_restnet_model()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the updated Restnet model.....
Create a Trainable RestNet model
--------------------------------------------------------------------------------
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, 230, 230, 3)  0           input_1[0][0]                    
__________________________________________________________________________________________________
conv1_conv (Conv2D)             (None, 112, 112, 64) 9472        conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 64) 256         conv1_conv[0][0]                 
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, 112, 112, 64) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D)       (None, 114, 114, 64) 0           conv1_relu[0][0]                 
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D)       (None, 56, 56, 64)   0           pool1_pad[0][0]                  
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, 56, 56, 64)   4160        pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 64)   0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 56, 56, 64)   0           conv2_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D)    (None, 56, 56, 256)  16640       pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_add (Add)          (None, 56, 56, 256)  0           conv2_block1_0_bn[0][0]          
                                                                 conv2_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_out (Activation)   (None, 56, 56, 256)  0           conv2_block1_add[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block1_out[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 64)   0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 56, 56, 64)   0           conv2_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_add (Add)          (None, 56, 56, 256)  0           conv2_block1_out[0][0]           
                                                                 conv2_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_out (Activation)   (None, 56, 56, 256)  0           conv2_block2_add[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, 56, 56, 64)   16448       conv2_block2_out[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 64)   0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, 56, 56, 64)   36928       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 56, 56, 64)   256         conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 56, 56, 64)   0           conv2_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D)    (None, 56, 56, 256)  16640       conv2_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 56, 56, 256)  1024        conv2_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_add (Add)          (None, 56, 56, 256)  0           conv2_block2_out[0][0]           
                                                                 conv2_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_out (Activation)   (None, 56, 56, 256)  0           conv2_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, 28, 28, 128)  32896       conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128)  0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 28, 28, 128)  0           conv3_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D)    (None, 28, 28, 512)  131584      conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_add (Add)          (None, 28, 28, 512)  0           conv3_block1_0_bn[0][0]          
                                                                 conv3_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_out (Activation)   (None, 28, 28, 512)  0           conv3_block1_add[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block1_out[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128)  0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 28, 28, 128)  0           conv3_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_add (Add)          (None, 28, 28, 512)  0           conv3_block1_out[0][0]           
                                                                 conv3_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_out (Activation)   (None, 28, 28, 512)  0           conv3_block2_add[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block2_out[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128)  0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 28, 28, 128)  0           conv3_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_add (Add)          (None, 28, 28, 512)  0           conv3_block2_out[0][0]           
                                                                 conv3_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_out (Activation)   (None, 28, 28, 512)  0           conv3_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, 28, 28, 128)  65664       conv3_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128)  0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, 28, 28, 128)  147584      conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 28, 28, 128)  512         conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 28, 28, 128)  0           conv3_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D)    (None, 28, 28, 512)  66048       conv3_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 28, 28, 512)  2048        conv3_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_add (Add)          (None, 28, 28, 512)  0           conv3_block3_out[0][0]           
                                                                 conv3_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_out (Activation)   (None, 28, 28, 512)  0           conv3_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, 14, 14, 256)  131328      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 256)  0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 14, 14, 256)  0           conv4_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D)    (None, 14, 14, 1024) 525312      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_0_bn[0][0]          
                                                                 conv4_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_out (Activation)   (None, 14, 14, 1024) 0           conv4_block1_add[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block1_out[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 256)  0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 14, 14, 256)  0           conv4_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_add (Add)          (None, 14, 14, 1024) 0           conv4_block1_out[0][0]           
                                                                 conv4_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_out (Activation)   (None, 14, 14, 1024) 0           conv4_block2_add[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block2_out[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 256)  0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 14, 14, 256)  0           conv4_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_add (Add)          (None, 14, 14, 1024) 0           conv4_block2_out[0][0]           
                                                                 conv4_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_out (Activation)   (None, 14, 14, 1024) 0           conv4_block3_add[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block3_out[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 256)  0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 14, 14, 256)  0           conv4_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_add (Add)          (None, 14, 14, 1024) 0           conv4_block3_out[0][0]           
                                                                 conv4_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_out (Activation)   (None, 14, 14, 1024) 0           conv4_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 256)  0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 14, 14, 256)  0           conv4_block5_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block5_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block5_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_add (Add)          (None, 14, 14, 1024) 0           conv4_block4_out[0][0]           
                                                                 conv4_block5_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_out (Activation)   (None, 14, 14, 1024) 0           conv4_block5_add[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, 14, 14, 256)  262400      conv4_block5_out[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 256)  0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, 14, 14, 256)  590080      conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 14, 14, 256)  1024        conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 14, 14, 256)  0           conv4_block6_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D)    (None, 14, 14, 1024) 263168      conv4_block6_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096        conv4_block6_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_add (Add)          (None, 14, 14, 1024) 0           conv4_block5_out[0][0]           
                                                                 conv4_block6_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_out (Activation)   (None, 14, 14, 1024) 0           conv4_block6_add[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D)    (None, 7, 7, 512)    524800      conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 512)    0           conv5_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 7, 7, 512)    0           conv5_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D)    (None, 7, 7, 2048)   2099200     conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_0_bn[0][0]          
                                                                 conv5_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_out (Activation)   (None, 7, 7, 2048)   0           conv5_block1_add[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block1_out[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 512)    0           conv5_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 7, 7, 512)    0           conv5_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_add (Add)          (None, 7, 7, 2048)   0           conv5_block1_out[0][0]           
                                                                 conv5_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_out (Activation)   (None, 7, 7, 2048)   0           conv5_block2_add[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)    (None, 7, 7, 512)    1049088     conv5_block2_out[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 512)    0           conv5_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)    (None, 7, 7, 512)    2359808     conv5_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 7, 7, 512)    2048        conv5_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 7, 7, 512)    0           conv5_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D)    (None, 7, 7, 2048)   1050624     conv5_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048)   8192        conv5_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_add (Add)          (None, 7, 7, 2048)   0           conv5_block2_out[0][0]           
                                                                 conv5_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_out (Activation)   (None, 7, 7, 2048)   0           conv5_block3_add[0][0]           
__________________________________________________________________________________________________
flatten (Flatten)               (None, 100352)       0           conv5_block3_out[0][0]           
__________________________________________________________________________________________________
dense (Dense)                   (None, 1)            100353      flatten[0][0]                    
==================================================================================================
Total params: 23,688,065
Trainable params: 100,353
Non-trainable params: 23,587,712
__________________________________________________________________________________________________
in call backs
Epoch 1/5
12/13 [==========================>...] - ETA: 33s - loss: 6.5168 - accuracy: 0.5110 - average_precision: 0.5102 - f1_score: 0.6261 
Epoch 00001: val_loss improved from inf to 7.27298, saving model to model_weights/Restnet_updated_final.h5
13/13 [==============================] - 449s 35s/step - loss: 6.6068 - accuracy: 0.5102 - average_precision: 0.5095 - f1_score: 0.6292 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 2/5
12/13 [==========================>...] - ETA: 27s - loss: 7.4539 - accuracy: 0.5193 - average_precision: 0.4967 - f1_score: 0.6573
Epoch 00002: val_loss did not improve from 7.27298
13/13 [==============================] - 365s 28s/step - loss: 7.4342 - accuracy: 0.5127 - average_precision: 0.4907 - f1_score: 0.6522 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/5
12/13 [==========================>...] - ETA: 20s - loss: 4.8630 - accuracy: 0.6271 - average_precision: 0.6447 - f1_score: 0.6584
Epoch 00003: val_loss did not improve from 7.27298
13/13 [==============================] - 279s 21s/step - loss: 4.7204 - accuracy: 0.6396 - average_precision: 0.6720 - f1_score: 0.6631 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 4/5
12/13 [==========================>...] - ETA: 18s - loss: 3.4906 - accuracy: 0.7265 - average_precision: 0.8167 - f1_score: 0.6580
Epoch 00004: val_loss did not improve from 7.27298

Epoch 00004: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
13/13 [==============================] - 258s 20s/step - loss: 3.5358 - accuracy: 0.7234 - average_precision: 0.8029 - f1_score: 0.6641 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/5
12/13 [==========================>...] - ETA: 18s - loss: 3.3594 - accuracy: 0.7597 - average_precision: 0.7042 - f1_score: 0.7516
Epoch 00005: val_loss did not improve from 7.27298
13/13 [==============================] - 260s 20s/step - loss: 3.3908 - accuracy: 0.7538 - average_precision: 0.7179 - f1_score: 0.7515 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Save the final weights
--------------------------------------------------------------------------------
In [ ]:
totalRestnetTime = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalRestnetTime)
Time: 1627.19 secs
In [ ]:
loss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {loss_Train}, Accuracy: {accuracy_Train}, AP: {ap_Train}, F1 Score: {float(f1_Train)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 7.645731705885667, Accuracy: 0.510152280330658, AP: 0.0, F1 Score: 0.0
In [ ]:
oss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
print(f'Loss: {float(loss_Valid)}, Accuracy: {float(accuracy_Valid)}, AP: {float(ap_Valid)}, F1 Score: {float(f1_Valid)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 7.272976636886597, Accuracy: 0.5102040767669678, AP: 0.0, F1 Score: 0.0
In [ ]:
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {float(loss_Test)}, Accuracy: {float(accuracy_Test)}, AP: {float(ap_Test)}, F1 Score: {float(f1_Test)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 6.790946960449219, Accuracy: 0.5510203838348389, AP: 0.0, F1 Score: 0.0
In [ ]:
df_model_results = populateModelResults(df_model_results, "Updated Resnet50", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalRestnetTime, f1_Test)
df_model_results.head(10)
Out[ ]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score
0 DenseNet121 0.489848 0.489796 0.44898 0.927249 2651.06 0.610816
1 Updated VGG16 0.489848 0.489796 0.44898 1.381736 46.19 0.610816
2 VGG16 0.510152 0.510204 0.55102 6.790947 2399.36 0.000000
3 Resnet50 0.510152 0.510204 0.55102 6.790947 1453.18 0.000000
4 Updated Resnet50 0.510152 0.510204 0.55102 6.790947 1627.19 0.000000
In [ ]:
print('Predict on the validation data'); print('--'*40)
validation_generator.reset()
valid_pred_roc = model.predict_generator(generator = validation_generator,
                                         steps = generators.step_size_valid,
                                         verbose = 1)
valid_pred = []
for i in valid_pred_roc:
    if i >= 0.5: valid_pred.append(1)
    else: valid_pred.append(0)
y_valid = df_val['Target'].astype(int).values
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 11s 5s/step
In [ ]:
print('Predict on the test data'); print('--'*40)
test_generator.reset()
test_pred_roc = model.predict_generator(generator = test_generator,
                                        steps = generators.step_size_test,
                                        verbose = 1)
test_pred = []
for i in test_pred_roc:
    if i >= 0.5: test_pred.append(1)
    else: test_pred.append(0)
y_test = df_test['Target'].astype(int).values
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 11s 5s/step
In [ ]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 11s 5s/step
0    25
1    24
dtype: int64
0    27
1    22
dtype: int64
Correctly predicted 27 images out of 49 images
Predicted 55% test images correctly
In [ ]:
viewPredictedImage(correct)
In [ ]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.500
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.500
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.55      1.00      0.71        27
   Pneumonia       0.00      0.00      0.00        22

    accuracy                           0.55        49
   macro avg       0.28      0.50      0.36        49
weighted avg       0.30      0.55      0.39        49

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.51      1.00      0.68        25
   Pneumonia       0.00      0.00      0.00        24

    accuracy                           0.51        49
   macro avg       0.26      0.50      0.34        49
weighted avg       0.26      0.51      0.34        49

Model 4 InceptionV3 Model

In [ ]:
start = time.time()
In [ ]:
print('Lets fit the InceptionV3 model.....')
FINAL_MODEL = "InceptionV3_final.h5"
K.clear_session()
#model = buildModel(InceptionV3)
model = build_inceptionV3_model()
callbacks = callback_model(FINAL_MODEL)
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Lets fit the InceptionV3 model.....
Create a Inceptionv3 model
--------------------------------------------------------------------------------
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, 111, 111, 32) 864         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, 111, 111, 32) 96          conv2d[0][0]                     
__________________________________________________________________________________________________
activation (Activation)         (None, 111, 111, 32) 0           batch_normalization[0][0]        
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 109, 109, 32) 9216        activation[0][0]                 
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 109, 109, 32) 96          conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 109, 109, 32) 0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 109, 109, 64) 18432       activation_1[0][0]               
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 109, 109, 64) 192         conv2d_2[0][0]                   
__________________________________________________________________________________________________
activation_2 (Activation)       (None, 109, 109, 64) 0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
max_pooling2d (MaxPooling2D)    (None, 54, 54, 64)   0           activation_2[0][0]               
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 54, 54, 80)   5120        max_pooling2d[0][0]              
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 54, 54, 80)   240         conv2d_3[0][0]                   
__________________________________________________________________________________________________
activation_3 (Activation)       (None, 54, 54, 80)   0           batch_normalization_3[0][0]      
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, 52, 52, 192)  138240      activation_3[0][0]               
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 52, 52, 192)  576         conv2d_4[0][0]                   
__________________________________________________________________________________________________
activation_4 (Activation)       (None, 52, 52, 192)  0           batch_normalization_4[0][0]      
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)  (None, 25, 25, 192)  0           activation_4[0][0]               
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, 25, 25, 64)   12288       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 25, 25, 64)   192         conv2d_8[0][0]                   
__________________________________________________________________________________________________
activation_8 (Activation)       (None, 25, 25, 64)   0           batch_normalization_8[0][0]      
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, 25, 25, 48)   9216        max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, 25, 25, 96)   55296       activation_8[0][0]               
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 25, 25, 48)   144         conv2d_6[0][0]                   
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 25, 25, 96)   288         conv2d_9[0][0]                   
__________________________________________________________________________________________________
activation_6 (Activation)       (None, 25, 25, 48)   0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
activation_9 (Activation)       (None, 25, 25, 96)   0           batch_normalization_9[0][0]      
__________________________________________________________________________________________________
average_pooling2d (AveragePooli (None, 25, 25, 192)  0           max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, 25, 25, 64)   12288       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, 25, 25, 64)   76800       activation_6[0][0]               
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, 25, 25, 96)   82944       activation_9[0][0]               
__________________________________________________________________________________________________
conv2d_11 (Conv2D)              (None, 25, 25, 32)   6144        average_pooling2d[0][0]          
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 25, 25, 64)   192         conv2d_5[0][0]                   
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 25, 25, 64)   192         conv2d_7[0][0]                   
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 25, 25, 96)   288         conv2d_10[0][0]                  
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 25, 25, 32)   96          conv2d_11[0][0]                  
__________________________________________________________________________________________________
activation_5 (Activation)       (None, 25, 25, 64)   0           batch_normalization_5[0][0]      
__________________________________________________________________________________________________
activation_7 (Activation)       (None, 25, 25, 64)   0           batch_normalization_7[0][0]      
__________________________________________________________________________________________________
activation_10 (Activation)      (None, 25, 25, 96)   0           batch_normalization_10[0][0]     
__________________________________________________________________________________________________
activation_11 (Activation)      (None, 25, 25, 32)   0           batch_normalization_11[0][0]     
__________________________________________________________________________________________________
mixed0 (Concatenate)            (None, 25, 25, 256)  0           activation_5[0][0]               
                                                                 activation_7[0][0]               
                                                                 activation_10[0][0]              
                                                                 activation_11[0][0]              
__________________________________________________________________________________________________
conv2d_15 (Conv2D)              (None, 25, 25, 64)   16384       mixed0[0][0]                     
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 25, 25, 64)   192         conv2d_15[0][0]                  
__________________________________________________________________________________________________
activation_15 (Activation)      (None, 25, 25, 64)   0           batch_normalization_15[0][0]     
__________________________________________________________________________________________________
conv2d_13 (Conv2D)              (None, 25, 25, 48)   12288       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_16 (Conv2D)              (None, 25, 25, 96)   55296       activation_15[0][0]              
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 25, 25, 48)   144         conv2d_13[0][0]                  
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 25, 25, 96)   288         conv2d_16[0][0]                  
__________________________________________________________________________________________________
activation_13 (Activation)      (None, 25, 25, 48)   0           batch_normalization_13[0][0]     
__________________________________________________________________________________________________
activation_16 (Activation)      (None, 25, 25, 96)   0           batch_normalization_16[0][0]     
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 25, 25, 256)  0           mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_12 (Conv2D)              (None, 25, 25, 64)   16384       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, 25, 25, 64)   76800       activation_13[0][0]              
__________________________________________________________________________________________________
conv2d_17 (Conv2D)              (None, 25, 25, 96)   82944       activation_16[0][0]              
__________________________________________________________________________________________________
conv2d_18 (Conv2D)              (None, 25, 25, 64)   16384       average_pooling2d_1[0][0]        
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 25, 25, 64)   192         conv2d_12[0][0]                  
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 25, 25, 64)   192         conv2d_14[0][0]                  
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 25, 25, 96)   288         conv2d_17[0][0]                  
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 25, 25, 64)   192         conv2d_18[0][0]                  
__________________________________________________________________________________________________
activation_12 (Activation)      (None, 25, 25, 64)   0           batch_normalization_12[0][0]     
__________________________________________________________________________________________________
activation_14 (Activation)      (None, 25, 25, 64)   0           batch_normalization_14[0][0]     
__________________________________________________________________________________________________
activation_17 (Activation)      (None, 25, 25, 96)   0           batch_normalization_17[0][0]     
__________________________________________________________________________________________________
activation_18 (Activation)      (None, 25, 25, 64)   0           batch_normalization_18[0][0]     
__________________________________________________________________________________________________
mixed1 (Concatenate)            (None, 25, 25, 288)  0           activation_12[0][0]              
                                                                 activation_14[0][0]              
                                                                 activation_17[0][0]              
                                                                 activation_18[0][0]              
__________________________________________________________________________________________________
conv2d_22 (Conv2D)              (None, 25, 25, 64)   18432       mixed1[0][0]                     
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 25, 25, 64)   192         conv2d_22[0][0]                  
__________________________________________________________________________________________________
activation_22 (Activation)      (None, 25, 25, 64)   0           batch_normalization_22[0][0]     
__________________________________________________________________________________________________
conv2d_20 (Conv2D)              (None, 25, 25, 48)   13824       mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_23 (Conv2D)              (None, 25, 25, 96)   55296       activation_22[0][0]              
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 25, 25, 48)   144         conv2d_20[0][0]                  
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 25, 25, 96)   288         conv2d_23[0][0]                  
__________________________________________________________________________________________________
activation_20 (Activation)      (None, 25, 25, 48)   0           batch_normalization_20[0][0]     
__________________________________________________________________________________________________
activation_23 (Activation)      (None, 25, 25, 96)   0           batch_normalization_23[0][0]     
__________________________________________________________________________________________________
average_pooling2d_2 (AveragePoo (None, 25, 25, 288)  0           mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_19 (Conv2D)              (None, 25, 25, 64)   18432       mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_21 (Conv2D)              (None, 25, 25, 64)   76800       activation_20[0][0]              
__________________________________________________________________________________________________
conv2d_24 (Conv2D)              (None, 25, 25, 96)   82944       activation_23[0][0]              
__________________________________________________________________________________________________
conv2d_25 (Conv2D)              (None, 25, 25, 64)   18432       average_pooling2d_2[0][0]        
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 25, 25, 64)   192         conv2d_19[0][0]                  
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 25, 25, 64)   192         conv2d_21[0][0]                  
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 25, 25, 96)   288         conv2d_24[0][0]                  
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 25, 25, 64)   192         conv2d_25[0][0]                  
__________________________________________________________________________________________________
activation_19 (Activation)      (None, 25, 25, 64)   0           batch_normalization_19[0][0]     
__________________________________________________________________________________________________
activation_21 (Activation)      (None, 25, 25, 64)   0           batch_normalization_21[0][0]     
__________________________________________________________________________________________________
activation_24 (Activation)      (None, 25, 25, 96)   0           batch_normalization_24[0][0]     
__________________________________________________________________________________________________
activation_25 (Activation)      (None, 25, 25, 64)   0           batch_normalization_25[0][0]     
__________________________________________________________________________________________________
mixed2 (Concatenate)            (None, 25, 25, 288)  0           activation_19[0][0]              
                                                                 activation_21[0][0]              
                                                                 activation_24[0][0]              
                                                                 activation_25[0][0]              
__________________________________________________________________________________________________
conv2d_27 (Conv2D)              (None, 25, 25, 64)   18432       mixed2[0][0]                     
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 25, 25, 64)   192         conv2d_27[0][0]                  
__________________________________________________________________________________________________
activation_27 (Activation)      (None, 25, 25, 64)   0           batch_normalization_27[0][0]     
__________________________________________________________________________________________________
conv2d_28 (Conv2D)              (None, 25, 25, 96)   55296       activation_27[0][0]              
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 25, 25, 96)   288         conv2d_28[0][0]                  
__________________________________________________________________________________________________
activation_28 (Activation)      (None, 25, 25, 96)   0           batch_normalization_28[0][0]     
__________________________________________________________________________________________________
conv2d_26 (Conv2D)              (None, 12, 12, 384)  995328      mixed2[0][0]                     
__________________________________________________________________________________________________
conv2d_29 (Conv2D)              (None, 12, 12, 96)   82944       activation_28[0][0]              
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 12, 12, 384)  1152        conv2d_26[0][0]                  
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 12, 12, 96)   288         conv2d_29[0][0]                  
__________________________________________________________________________________________________
activation_26 (Activation)      (None, 12, 12, 384)  0           batch_normalization_26[0][0]     
__________________________________________________________________________________________________
activation_29 (Activation)      (None, 12, 12, 96)   0           batch_normalization_29[0][0]     
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)  (None, 12, 12, 288)  0           mixed2[0][0]                     
__________________________________________________________________________________________________
mixed3 (Concatenate)            (None, 12, 12, 768)  0           activation_26[0][0]              
                                                                 activation_29[0][0]              
                                                                 max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
conv2d_34 (Conv2D)              (None, 12, 12, 128)  98304       mixed3[0][0]                     
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 12, 12, 128)  384         conv2d_34[0][0]                  
__________________________________________________________________________________________________
activation_34 (Activation)      (None, 12, 12, 128)  0           batch_normalization_34[0][0]     
__________________________________________________________________________________________________
conv2d_35 (Conv2D)              (None, 12, 12, 128)  114688      activation_34[0][0]              
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 12, 12, 128)  384         conv2d_35[0][0]                  
__________________________________________________________________________________________________
activation_35 (Activation)      (None, 12, 12, 128)  0           batch_normalization_35[0][0]     
__________________________________________________________________________________________________
conv2d_31 (Conv2D)              (None, 12, 12, 128)  98304       mixed3[0][0]                     
__________________________________________________________________________________________________
conv2d_36 (Conv2D)              (None, 12, 12, 128)  114688      activation_35[0][0]              
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 12, 12, 128)  384         conv2d_31[0][0]                  
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 12, 12, 128)  384         conv2d_36[0][0]                  
__________________________________________________________________________________________________
activation_31 (Activation)      (None, 12, 12, 128)  0           batch_normalization_31[0][0]     
__________________________________________________________________________________________________
activation_36 (Activation)      (None, 12, 12, 128)  0           batch_normalization_36[0][0]     
__________________________________________________________________________________________________
conv2d_32 (Conv2D)              (None, 12, 12, 128)  114688      activation_31[0][0]              
__________________________________________________________________________________________________
conv2d_37 (Conv2D)              (None, 12, 12, 128)  114688      activation_36[0][0]              
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 12, 12, 128)  384         conv2d_32[0][0]                  
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 12, 12, 128)  384         conv2d_37[0][0]                  
__________________________________________________________________________________________________
activation_32 (Activation)      (None, 12, 12, 128)  0           batch_normalization_32[0][0]     
__________________________________________________________________________________________________
activation_37 (Activation)      (None, 12, 12, 128)  0           batch_normalization_37[0][0]     
__________________________________________________________________________________________________
average_pooling2d_3 (AveragePoo (None, 12, 12, 768)  0           mixed3[0][0]                     
__________________________________________________________________________________________________
conv2d_30 (Conv2D)              (None, 12, 12, 192)  147456      mixed3[0][0]                     
__________________________________________________________________________________________________
conv2d_33 (Conv2D)              (None, 12, 12, 192)  172032      activation_32[0][0]              
__________________________________________________________________________________________________
conv2d_38 (Conv2D)              (None, 12, 12, 192)  172032      activation_37[0][0]              
__________________________________________________________________________________________________
conv2d_39 (Conv2D)              (None, 12, 12, 192)  147456      average_pooling2d_3[0][0]        
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 12, 12, 192)  576         conv2d_30[0][0]                  
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 12, 12, 192)  576         conv2d_33[0][0]                  
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 12, 12, 192)  576         conv2d_38[0][0]                  
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 12, 12, 192)  576         conv2d_39[0][0]                  
__________________________________________________________________________________________________
activation_30 (Activation)      (None, 12, 12, 192)  0           batch_normalization_30[0][0]     
__________________________________________________________________________________________________
activation_33 (Activation)      (None, 12, 12, 192)  0           batch_normalization_33[0][0]     
__________________________________________________________________________________________________
activation_38 (Activation)      (None, 12, 12, 192)  0           batch_normalization_38[0][0]     
__________________________________________________________________________________________________
activation_39 (Activation)      (None, 12, 12, 192)  0           batch_normalization_39[0][0]     
__________________________________________________________________________________________________
mixed4 (Concatenate)            (None, 12, 12, 768)  0           activation_30[0][0]              
                                                                 activation_33[0][0]              
                                                                 activation_38[0][0]              
                                                                 activation_39[0][0]              
__________________________________________________________________________________________________
conv2d_44 (Conv2D)              (None, 12, 12, 160)  122880      mixed4[0][0]                     
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 12, 12, 160)  480         conv2d_44[0][0]                  
__________________________________________________________________________________________________
activation_44 (Activation)      (None, 12, 12, 160)  0           batch_normalization_44[0][0]     
__________________________________________________________________________________________________
conv2d_45 (Conv2D)              (None, 12, 12, 160)  179200      activation_44[0][0]              
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 12, 12, 160)  480         conv2d_45[0][0]                  
__________________________________________________________________________________________________
activation_45 (Activation)      (None, 12, 12, 160)  0           batch_normalization_45[0][0]     
__________________________________________________________________________________________________
conv2d_41 (Conv2D)              (None, 12, 12, 160)  122880      mixed4[0][0]                     
__________________________________________________________________________________________________
conv2d_46 (Conv2D)              (None, 12, 12, 160)  179200      activation_45[0][0]              
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 12, 12, 160)  480         conv2d_41[0][0]                  
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 12, 12, 160)  480         conv2d_46[0][0]                  
__________________________________________________________________________________________________
activation_41 (Activation)      (None, 12, 12, 160)  0           batch_normalization_41[0][0]     
__________________________________________________________________________________________________
activation_46 (Activation)      (None, 12, 12, 160)  0           batch_normalization_46[0][0]     
__________________________________________________________________________________________________
conv2d_42 (Conv2D)              (None, 12, 12, 160)  179200      activation_41[0][0]              
__________________________________________________________________________________________________
conv2d_47 (Conv2D)              (None, 12, 12, 160)  179200      activation_46[0][0]              
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 12, 12, 160)  480         conv2d_42[0][0]                  
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 12, 12, 160)  480         conv2d_47[0][0]                  
__________________________________________________________________________________________________
activation_42 (Activation)      (None, 12, 12, 160)  0           batch_normalization_42[0][0]     
__________________________________________________________________________________________________
activation_47 (Activation)      (None, 12, 12, 160)  0           batch_normalization_47[0][0]     
__________________________________________________________________________________________________
average_pooling2d_4 (AveragePoo (None, 12, 12, 768)  0           mixed4[0][0]                     
__________________________________________________________________________________________________
conv2d_40 (Conv2D)              (None, 12, 12, 192)  147456      mixed4[0][0]                     
__________________________________________________________________________________________________
conv2d_43 (Conv2D)              (None, 12, 12, 192)  215040      activation_42[0][0]              
__________________________________________________________________________________________________
conv2d_48 (Conv2D)              (None, 12, 12, 192)  215040      activation_47[0][0]              
__________________________________________________________________________________________________
conv2d_49 (Conv2D)              (None, 12, 12, 192)  147456      average_pooling2d_4[0][0]        
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 12, 12, 192)  576         conv2d_40[0][0]                  
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 12, 12, 192)  576         conv2d_43[0][0]                  
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 12, 12, 192)  576         conv2d_48[0][0]                  
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 12, 12, 192)  576         conv2d_49[0][0]                  
__________________________________________________________________________________________________
activation_40 (Activation)      (None, 12, 12, 192)  0           batch_normalization_40[0][0]     
__________________________________________________________________________________________________
activation_43 (Activation)      (None, 12, 12, 192)  0           batch_normalization_43[0][0]     
__________________________________________________________________________________________________
activation_48 (Activation)      (None, 12, 12, 192)  0           batch_normalization_48[0][0]     
__________________________________________________________________________________________________
activation_49 (Activation)      (None, 12, 12, 192)  0           batch_normalization_49[0][0]     
__________________________________________________________________________________________________
mixed5 (Concatenate)            (None, 12, 12, 768)  0           activation_40[0][0]              
                                                                 activation_43[0][0]              
                                                                 activation_48[0][0]              
                                                                 activation_49[0][0]              
__________________________________________________________________________________________________
conv2d_54 (Conv2D)              (None, 12, 12, 160)  122880      mixed5[0][0]                     
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 12, 12, 160)  480         conv2d_54[0][0]                  
__________________________________________________________________________________________________
activation_54 (Activation)      (None, 12, 12, 160)  0           batch_normalization_54[0][0]     
__________________________________________________________________________________________________
conv2d_55 (Conv2D)              (None, 12, 12, 160)  179200      activation_54[0][0]              
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 12, 12, 160)  480         conv2d_55[0][0]                  
__________________________________________________________________________________________________
activation_55 (Activation)      (None, 12, 12, 160)  0           batch_normalization_55[0][0]     
__________________________________________________________________________________________________
conv2d_51 (Conv2D)              (None, 12, 12, 160)  122880      mixed5[0][0]                     
__________________________________________________________________________________________________
conv2d_56 (Conv2D)              (None, 12, 12, 160)  179200      activation_55[0][0]              
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 12, 12, 160)  480         conv2d_51[0][0]                  
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 12, 12, 160)  480         conv2d_56[0][0]                  
__________________________________________________________________________________________________
activation_51 (Activation)      (None, 12, 12, 160)  0           batch_normalization_51[0][0]     
__________________________________________________________________________________________________
activation_56 (Activation)      (None, 12, 12, 160)  0           batch_normalization_56[0][0]     
__________________________________________________________________________________________________
conv2d_52 (Conv2D)              (None, 12, 12, 160)  179200      activation_51[0][0]              
__________________________________________________________________________________________________
conv2d_57 (Conv2D)              (None, 12, 12, 160)  179200      activation_56[0][0]              
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 12, 12, 160)  480         conv2d_52[0][0]                  
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 12, 12, 160)  480         conv2d_57[0][0]                  
__________________________________________________________________________________________________
activation_52 (Activation)      (None, 12, 12, 160)  0           batch_normalization_52[0][0]     
__________________________________________________________________________________________________
activation_57 (Activation)      (None, 12, 12, 160)  0           batch_normalization_57[0][0]     
__________________________________________________________________________________________________
average_pooling2d_5 (AveragePoo (None, 12, 12, 768)  0           mixed5[0][0]                     
__________________________________________________________________________________________________
conv2d_50 (Conv2D)              (None, 12, 12, 192)  147456      mixed5[0][0]                     
__________________________________________________________________________________________________
conv2d_53 (Conv2D)              (None, 12, 12, 192)  215040      activation_52[0][0]              
__________________________________________________________________________________________________
conv2d_58 (Conv2D)              (None, 12, 12, 192)  215040      activation_57[0][0]              
__________________________________________________________________________________________________
conv2d_59 (Conv2D)              (None, 12, 12, 192)  147456      average_pooling2d_5[0][0]        
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 12, 12, 192)  576         conv2d_50[0][0]                  
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 12, 12, 192)  576         conv2d_53[0][0]                  
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 12, 12, 192)  576         conv2d_58[0][0]                  
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 12, 12, 192)  576         conv2d_59[0][0]                  
__________________________________________________________________________________________________
activation_50 (Activation)      (None, 12, 12, 192)  0           batch_normalization_50[0][0]     
__________________________________________________________________________________________________
activation_53 (Activation)      (None, 12, 12, 192)  0           batch_normalization_53[0][0]     
__________________________________________________________________________________________________
activation_58 (Activation)      (None, 12, 12, 192)  0           batch_normalization_58[0][0]     
__________________________________________________________________________________________________
activation_59 (Activation)      (None, 12, 12, 192)  0           batch_normalization_59[0][0]     
__________________________________________________________________________________________________
mixed6 (Concatenate)            (None, 12, 12, 768)  0           activation_50[0][0]              
                                                                 activation_53[0][0]              
                                                                 activation_58[0][0]              
                                                                 activation_59[0][0]              
__________________________________________________________________________________________________
conv2d_64 (Conv2D)              (None, 12, 12, 192)  147456      mixed6[0][0]                     
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 12, 12, 192)  576         conv2d_64[0][0]                  
__________________________________________________________________________________________________
activation_64 (Activation)      (None, 12, 12, 192)  0           batch_normalization_64[0][0]     
__________________________________________________________________________________________________
conv2d_65 (Conv2D)              (None, 12, 12, 192)  258048      activation_64[0][0]              
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 12, 12, 192)  576         conv2d_65[0][0]                  
__________________________________________________________________________________________________
activation_65 (Activation)      (None, 12, 12, 192)  0           batch_normalization_65[0][0]     
__________________________________________________________________________________________________
conv2d_61 (Conv2D)              (None, 12, 12, 192)  147456      mixed6[0][0]                     
__________________________________________________________________________________________________
conv2d_66 (Conv2D)              (None, 12, 12, 192)  258048      activation_65[0][0]              
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 12, 12, 192)  576         conv2d_61[0][0]                  
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 12, 12, 192)  576         conv2d_66[0][0]                  
__________________________________________________________________________________________________
activation_61 (Activation)      (None, 12, 12, 192)  0           batch_normalization_61[0][0]     
__________________________________________________________________________________________________
activation_66 (Activation)      (None, 12, 12, 192)  0           batch_normalization_66[0][0]     
__________________________________________________________________________________________________
conv2d_62 (Conv2D)              (None, 12, 12, 192)  258048      activation_61[0][0]              
__________________________________________________________________________________________________
conv2d_67 (Conv2D)              (None, 12, 12, 192)  258048      activation_66[0][0]              
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 12, 12, 192)  576         conv2d_62[0][0]                  
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 12, 12, 192)  576         conv2d_67[0][0]                  
__________________________________________________________________________________________________
activation_62 (Activation)      (None, 12, 12, 192)  0           batch_normalization_62[0][0]     
__________________________________________________________________________________________________
activation_67 (Activation)      (None, 12, 12, 192)  0           batch_normalization_67[0][0]     
__________________________________________________________________________________________________
average_pooling2d_6 (AveragePoo (None, 12, 12, 768)  0           mixed6[0][0]                     
__________________________________________________________________________________________________
conv2d_60 (Conv2D)              (None, 12, 12, 192)  147456      mixed6[0][0]                     
__________________________________________________________________________________________________
conv2d_63 (Conv2D)              (None, 12, 12, 192)  258048      activation_62[0][0]              
__________________________________________________________________________________________________
conv2d_68 (Conv2D)              (None, 12, 12, 192)  258048      activation_67[0][0]              
__________________________________________________________________________________________________
conv2d_69 (Conv2D)              (None, 12, 12, 192)  147456      average_pooling2d_6[0][0]        
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 12, 12, 192)  576         conv2d_60[0][0]                  
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 12, 12, 192)  576         conv2d_63[0][0]                  
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 12, 12, 192)  576         conv2d_68[0][0]                  
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 12, 12, 192)  576         conv2d_69[0][0]                  
__________________________________________________________________________________________________
activation_60 (Activation)      (None, 12, 12, 192)  0           batch_normalization_60[0][0]     
__________________________________________________________________________________________________
activation_63 (Activation)      (None, 12, 12, 192)  0           batch_normalization_63[0][0]     
__________________________________________________________________________________________________
activation_68 (Activation)      (None, 12, 12, 192)  0           batch_normalization_68[0][0]     
__________________________________________________________________________________________________
activation_69 (Activation)      (None, 12, 12, 192)  0           batch_normalization_69[0][0]     
__________________________________________________________________________________________________
mixed7 (Concatenate)            (None, 12, 12, 768)  0           activation_60[0][0]              
                                                                 activation_63[0][0]              
                                                                 activation_68[0][0]              
                                                                 activation_69[0][0]              
__________________________________________________________________________________________________
conv2d_72 (Conv2D)              (None, 12, 12, 192)  147456      mixed7[0][0]                     
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 12, 12, 192)  576         conv2d_72[0][0]                  
__________________________________________________________________________________________________
activation_72 (Activation)      (None, 12, 12, 192)  0           batch_normalization_72[0][0]     
__________________________________________________________________________________________________
conv2d_73 (Conv2D)              (None, 12, 12, 192)  258048      activation_72[0][0]              
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 12, 12, 192)  576         conv2d_73[0][0]                  
__________________________________________________________________________________________________
activation_73 (Activation)      (None, 12, 12, 192)  0           batch_normalization_73[0][0]     
__________________________________________________________________________________________________
conv2d_70 (Conv2D)              (None, 12, 12, 192)  147456      mixed7[0][0]                     
__________________________________________________________________________________________________
conv2d_74 (Conv2D)              (None, 12, 12, 192)  258048      activation_73[0][0]              
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 12, 12, 192)  576         conv2d_70[0][0]                  
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 12, 12, 192)  576         conv2d_74[0][0]                  
__________________________________________________________________________________________________
activation_70 (Activation)      (None, 12, 12, 192)  0           batch_normalization_70[0][0]     
__________________________________________________________________________________________________
activation_74 (Activation)      (None, 12, 12, 192)  0           batch_normalization_74[0][0]     
__________________________________________________________________________________________________
conv2d_71 (Conv2D)              (None, 5, 5, 320)    552960      activation_70[0][0]              
__________________________________________________________________________________________________
conv2d_75 (Conv2D)              (None, 5, 5, 192)    331776      activation_74[0][0]              
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 5, 5, 320)    960         conv2d_71[0][0]                  
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 5, 5, 192)    576         conv2d_75[0][0]                  
__________________________________________________________________________________________________
activation_71 (Activation)      (None, 5, 5, 320)    0           batch_normalization_71[0][0]     
__________________________________________________________________________________________________
activation_75 (Activation)      (None, 5, 5, 192)    0           batch_normalization_75[0][0]     
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)  (None, 5, 5, 768)    0           mixed7[0][0]                     
__________________________________________________________________________________________________
mixed8 (Concatenate)            (None, 5, 5, 1280)   0           activation_71[0][0]              
                                                                 activation_75[0][0]              
                                                                 max_pooling2d_3[0][0]            
__________________________________________________________________________________________________
conv2d_80 (Conv2D)              (None, 5, 5, 448)    573440      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 5, 5, 448)    1344        conv2d_80[0][0]                  
__________________________________________________________________________________________________
activation_80 (Activation)      (None, 5, 5, 448)    0           batch_normalization_80[0][0]     
__________________________________________________________________________________________________
conv2d_77 (Conv2D)              (None, 5, 5, 384)    491520      mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_81 (Conv2D)              (None, 5, 5, 384)    1548288     activation_80[0][0]              
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 5, 5, 384)    1152        conv2d_77[0][0]                  
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 5, 5, 384)    1152        conv2d_81[0][0]                  
__________________________________________________________________________________________________
activation_77 (Activation)      (None, 5, 5, 384)    0           batch_normalization_77[0][0]     
__________________________________________________________________________________________________
activation_81 (Activation)      (None, 5, 5, 384)    0           batch_normalization_81[0][0]     
__________________________________________________________________________________________________
conv2d_78 (Conv2D)              (None, 5, 5, 384)    442368      activation_77[0][0]              
__________________________________________________________________________________________________
conv2d_79 (Conv2D)              (None, 5, 5, 384)    442368      activation_77[0][0]              
__________________________________________________________________________________________________
conv2d_82 (Conv2D)              (None, 5, 5, 384)    442368      activation_81[0][0]              
__________________________________________________________________________________________________
conv2d_83 (Conv2D)              (None, 5, 5, 384)    442368      activation_81[0][0]              
__________________________________________________________________________________________________
average_pooling2d_7 (AveragePoo (None, 5, 5, 1280)   0           mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_76 (Conv2D)              (None, 5, 5, 320)    409600      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 5, 5, 384)    1152        conv2d_78[0][0]                  
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 5, 5, 384)    1152        conv2d_79[0][0]                  
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 5, 5, 384)    1152        conv2d_82[0][0]                  
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 5, 5, 384)    1152        conv2d_83[0][0]                  
__________________________________________________________________________________________________
conv2d_84 (Conv2D)              (None, 5, 5, 192)    245760      average_pooling2d_7[0][0]        
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 5, 5, 320)    960         conv2d_76[0][0]                  
__________________________________________________________________________________________________
activation_78 (Activation)      (None, 5, 5, 384)    0           batch_normalization_78[0][0]     
__________________________________________________________________________________________________
activation_79 (Activation)      (None, 5, 5, 384)    0           batch_normalization_79[0][0]     
__________________________________________________________________________________________________
activation_82 (Activation)      (None, 5, 5, 384)    0           batch_normalization_82[0][0]     
__________________________________________________________________________________________________
activation_83 (Activation)      (None, 5, 5, 384)    0           batch_normalization_83[0][0]     
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 5, 5, 192)    576         conv2d_84[0][0]                  
__________________________________________________________________________________________________
activation_76 (Activation)      (None, 5, 5, 320)    0           batch_normalization_76[0][0]     
__________________________________________________________________________________________________
mixed9_0 (Concatenate)          (None, 5, 5, 768)    0           activation_78[0][0]              
                                                                 activation_79[0][0]              
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, 5, 5, 768)    0           activation_82[0][0]              
                                                                 activation_83[0][0]              
__________________________________________________________________________________________________
activation_84 (Activation)      (None, 5, 5, 192)    0           batch_normalization_84[0][0]     
__________________________________________________________________________________________________
mixed9 (Concatenate)            (None, 5, 5, 2048)   0           activation_76[0][0]              
                                                                 mixed9_0[0][0]                   
                                                                 concatenate[0][0]                
                                                                 activation_84[0][0]              
__________________________________________________________________________________________________
conv2d_89 (Conv2D)              (None, 5, 5, 448)    917504      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 5, 5, 448)    1344        conv2d_89[0][0]                  
__________________________________________________________________________________________________
activation_89 (Activation)      (None, 5, 5, 448)    0           batch_normalization_89[0][0]     
__________________________________________________________________________________________________
conv2d_86 (Conv2D)              (None, 5, 5, 384)    786432      mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_90 (Conv2D)              (None, 5, 5, 384)    1548288     activation_89[0][0]              
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 5, 5, 384)    1152        conv2d_86[0][0]                  
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 5, 5, 384)    1152        conv2d_90[0][0]                  
__________________________________________________________________________________________________
activation_86 (Activation)      (None, 5, 5, 384)    0           batch_normalization_86[0][0]     
__________________________________________________________________________________________________
activation_90 (Activation)      (None, 5, 5, 384)    0           batch_normalization_90[0][0]     
__________________________________________________________________________________________________
conv2d_87 (Conv2D)              (None, 5, 5, 384)    442368      activation_86[0][0]              
__________________________________________________________________________________________________
conv2d_88 (Conv2D)              (None, 5, 5, 384)    442368      activation_86[0][0]              
__________________________________________________________________________________________________
conv2d_91 (Conv2D)              (None, 5, 5, 384)    442368      activation_90[0][0]              
__________________________________________________________________________________________________
conv2d_92 (Conv2D)              (None, 5, 5, 384)    442368      activation_90[0][0]              
__________________________________________________________________________________________________
average_pooling2d_8 (AveragePoo (None, 5, 5, 2048)   0           mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_85 (Conv2D)              (None, 5, 5, 320)    655360      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 5, 5, 384)    1152        conv2d_87[0][0]                  
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 5, 5, 384)    1152        conv2d_88[0][0]                  
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 5, 5, 384)    1152        conv2d_91[0][0]                  
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 5, 5, 384)    1152        conv2d_92[0][0]                  
__________________________________________________________________________________________________
conv2d_93 (Conv2D)              (None, 5, 5, 192)    393216      average_pooling2d_8[0][0]        
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 5, 5, 320)    960         conv2d_85[0][0]                  
__________________________________________________________________________________________________
activation_87 (Activation)      (None, 5, 5, 384)    0           batch_normalization_87[0][0]     
__________________________________________________________________________________________________
activation_88 (Activation)      (None, 5, 5, 384)    0           batch_normalization_88[0][0]     
__________________________________________________________________________________________________
activation_91 (Activation)      (None, 5, 5, 384)    0           batch_normalization_91[0][0]     
__________________________________________________________________________________________________
activation_92 (Activation)      (None, 5, 5, 384)    0           batch_normalization_92[0][0]     
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 5, 5, 192)    576         conv2d_93[0][0]                  
__________________________________________________________________________________________________
activation_85 (Activation)      (None, 5, 5, 320)    0           batch_normalization_85[0][0]     
__________________________________________________________________________________________________
mixed9_1 (Concatenate)          (None, 5, 5, 768)    0           activation_87[0][0]              
                                                                 activation_88[0][0]              
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 5, 5, 768)    0           activation_91[0][0]              
                                                                 activation_92[0][0]              
__________________________________________________________________________________________________
activation_93 (Activation)      (None, 5, 5, 192)    0           batch_normalization_93[0][0]     
__________________________________________________________________________________________________
mixed10 (Concatenate)           (None, 5, 5, 2048)   0           activation_85[0][0]              
                                                                 mixed9_1[0][0]                   
                                                                 concatenate_1[0][0]              
                                                                 activation_93[0][0]              
__________________________________________________________________________________________________
flatten (Flatten)               (None, 51200)        0           mixed10[0][0]                    
__________________________________________________________________________________________________
dense (Dense)                   (None, 1)            51201       flatten[0][0]                    
==================================================================================================
Total params: 21,853,985
Trainable params: 51,201
Non-trainable params: 21,802,784
__________________________________________________________________________________________________
in call backs
Epoch 1/5
12/13 [==========================>...] - ETA: 11s - loss: 5.8186 - accuracy: 0.5193 - average_precision: 0.5541 - f1_score: 0.2598
Epoch 00001: val_loss improved from inf to 7.27298, saving model to model_weights/InceptionV3_final.h5
13/13 [==============================] - 162s 12s/step - loss: 5.8895 - accuracy: 0.5228 - average_precision: 0.5884 - f1_score: 0.2494 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 2/5
12/13 [==========================>...] - ETA: 10s - loss: 6.6404 - accuracy: 0.5387 - average_precision: 0.6292 - f1_score: 0.1969
Epoch 00002: val_loss did not improve from 7.27298
13/13 [==============================] - 146s 11s/step - loss: 6.5323 - accuracy: 0.5457 - average_precision: 0.6449 - f1_score: 0.2168 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/5
12/13 [==========================>...] - ETA: 10s - loss: 3.7814 - accuracy: 0.6215 - average_precision: 0.6562 - f1_score: 0.5232
Epoch 00003: val_loss did not improve from 7.27298
13/13 [==============================] - 146s 11s/step - loss: 3.9320 - accuracy: 0.6117 - average_precision: 0.6442 - f1_score: 0.5125 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 4/5
12/13 [==========================>...] - ETA: 10s - loss: 4.0056 - accuracy: 0.6050 - average_precision: 0.6202 - f1_score: 0.6325
Epoch 00004: val_loss did not improve from 7.27298

Epoch 00004: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
13/13 [==============================] - 149s 11s/step - loss: 3.9485 - accuracy: 0.6066 - average_precision: 0.6284 - f1_score: 0.6278 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/5
12/13 [==========================>...] - ETA: 10s - loss: 2.8849 - accuracy: 0.6934 - average_precision: 0.7061 - f1_score: 0.6069
Epoch 00005: val_loss did not improve from 7.27298
13/13 [==============================] - 148s 11s/step - loss: 2.9041 - accuracy: 0.6929 - average_precision: 0.7169 - f1_score: 0.6131 - val_loss: 7.2730 - val_accuracy: 0.5102 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Save the final weights
--------------------------------------------------------------------------------
In [ ]:
totalInceptionTime = "{:.2f}".format(time.time() - start)
print(f'Time: %s secs' %totalInceptionTime)
Time: 767.04 secs
In [ ]:
loss_Train, accuracy_Train, ap_Train, f1_Train = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on Train data'); print('--'*40)
train_generator.reset()
loss_Train, accuracy_Train, ap_Train, f1_Train = model.evaluate_generator(generator = train_generator, 
                                          steps = generators.step_size_train)
print(f'Loss: {loss_Train}, Accuracy: {float(accuracy_Train)}, AP: {float(ap_Train)}, F1 Score: {float(f1_Train)}')
Evaluate the model on Train data
--------------------------------------------------------------------------------
Loss: 7.300711448375996, Accuracy: 0.510152280330658, AP: 0.0, F1 Score: 0.0
In [ ]:
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = 0.0, 0.0, 0.0, 0.0
loss_Test, accuracy_Test, ap_Test, f1_Test = 0.0, 0.0, 0.0, 0.0
print('Evaluate the model on validation data'); print('--'*40)
validation_generator.reset()
loss_Valid, accuracy_Valid, ap_Valid, f1_Valid = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
print(f'Loss: {float(loss_Valid)}, Accuracy: {float(accuracy_Valid)}, AP: {float(ap_Valid)}, F1 Score: {float(f1_Valid)}')
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 7.272976636886597, Accuracy: 0.5102040767669678, AP: 0.0, F1 Score: 0.0
In [ ]:
print('\nEvaluate the model on test data'); print('--'*40)
test_generator.reset()
loss_Test, accuracy_Test, ap_Test, f1_Test = model.evaluate_generator(generator = test_generator, 
                                          steps = generators.step_size_test)
print(f'Loss: {float(loss_Test)}, Accuracy: {float(accuracy_Test)}, AP: {float(ap_Test)}, F1 Score: {float(f1_Test)}')
Evaluate the model on test data
--------------------------------------------------------------------------------
Loss: 6.790946960449219, Accuracy: 0.5510203838348389, AP: 0.0, F1 Score: 0.0
In [ ]:
print(accuracy_Train, accuracy_Valid, accuracy_Test, loss_Test, totalInceptionTime, f1_Test)
df_model_results = populateModelResults(df_model_results, "InceptionV3", accuracy_Train, accuracy_Valid, 
                                        accuracy_Test, loss_Test, totalInceptionTime, f1_Test)
df_model_results.head(10)
0.5101523 0.5102041 0.5510204 6.790946960449219 767.04 0.0
Out[ ]:
Model_Name Train_Accuracy Validation_Accuracy Test_Accuracy Loss Total_Time_Secs f1_Score
0 DenseNet121 0.489848 0.489796 0.44898 0.927249 2651.06 0.610816
1 Updated VGG16 0.489848 0.489796 0.44898 1.381736 46.19 0.610816
2 VGG16 0.510152 0.510204 0.55102 6.790947 2399.36 0.000000
3 Resnet50 0.510152 0.510204 0.55102 6.790947 1453.18 0.000000
4 Updated Resnet50 0.510152 0.510204 0.55102 6.790947 1627.19 0.000000
5 InceptionV3 0.510152 0.510204 0.55102 6.790947 767.04 0.000000
In [ ]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 5s 3s/step
0    25
1    24
dtype: int64
0    27
1    22
dtype: int64
Correctly predicted 27 images out of 49 images
Predicted 55% test images correctly
In [ ]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 5s 3s/step
In [ ]:
viewPredictedImage(correct)
In [ ]:
viewPredictedImage(incorrect)
In [ ]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.500
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.500
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.55      1.00      0.71        27
   Pneumonia       0.00      0.00      0.00        22

    accuracy                           0.55        49
   macro avg       0.28      0.50      0.36        49
weighted avg       0.30      0.55      0.39        49

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.51      1.00      0.68        25
   Pneumonia       0.00      0.00      0.00        24

    accuracy                           0.51        49
   macro avg       0.26      0.50      0.34        49
weighted avg       0.26      0.51      0.34        49

In [ ]: